Mar 12 12:19:14.887419 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 12 12:19:15.517156 master-0 kubenswrapper[4102]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 12:19:15.517156 master-0 kubenswrapper[4102]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 12 12:19:15.517156 master-0 kubenswrapper[4102]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 12:19:15.518348 master-0 kubenswrapper[4102]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 12:19:15.518348 master-0 kubenswrapper[4102]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 12 12:19:15.518348 master-0 kubenswrapper[4102]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
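The deprecation warnings above all point at the same remedy: move the flag into the file passed via --config. A minimal sketch of what that file could look like for this node is below. The field names are assumed from the KubeletConfiguration v1beta1 API (containerRuntimeEndpoint requires a reasonably recent kubelet), and the values are taken from the FLAG dump later in this log; --minimum-container-ttl-duration has no config-file equivalent, which is why the warning suggests the eviction settings instead, and the pause image is expected to come from the CRI runtime rather than the kubelet.

```yaml
# Hypothetical KubeletConfiguration fragment (e.g. /etc/kubernetes/kubelet.conf)
# replacing the deprecated command-line flags logged above.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "/var/run/crio/crio.sock"            # was --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # was --volume-plugin-dir
registerWithTaints:                                            # was --register-with-taints
  - key: node-role.kubernetes.io/master
    effect: NoSchedule
systemReserved:                                                # was --system-reserved
  cpu: 500m
  memory: 1Gi
  ephemeral-storage: 1Gi
```

On an OpenShift node this file is rendered by the Machine Config Operator, so the fragment is illustrative rather than something to edit in place.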
Mar 12 12:19:15.519638 master-0 kubenswrapper[4102]: I0312 12:19:15.519414 4102 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 12:19:15.526240 master-0 kubenswrapper[4102]: W0312 12:19:15.526151 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 12:19:15.526240 master-0 kubenswrapper[4102]: W0312 12:19:15.526217 4102 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 12:19:15.526240 master-0 kubenswrapper[4102]: W0312 12:19:15.526227 4102 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 12:19:15.526240 master-0 kubenswrapper[4102]: W0312 12:19:15.526236 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 12:19:15.526240 master-0 kubenswrapper[4102]: W0312 12:19:15.526246 4102 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 12:19:15.526240 master-0 kubenswrapper[4102]: W0312 12:19:15.526255 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 12:19:15.526240 master-0 kubenswrapper[4102]: W0312 12:19:15.526265 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526274 4102 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526287 4102 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526299 4102 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526308 4102 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526317 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526325 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526336 4102 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526346 4102 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526356 4102 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526364 4102 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526372 4102 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526381 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526389 4102 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526396 4102 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526405 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526415 4102 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526424 4102 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526432 4102 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 12:19:15.526780 master-0 kubenswrapper[4102]: W0312 12:19:15.526440 4102 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526447 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526456 4102 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526467 4102 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526498 4102 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526506 4102 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526514 4102 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526522 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526531 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526538 4102 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526546 4102 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526553 4102 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526561 4102 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526569 4102 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526577 4102 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526586 4102 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526594 4102 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526601 4102 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526609 4102 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 12:19:15.527822 master-0 kubenswrapper[4102]: W0312 12:19:15.526616 4102 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526624 4102 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526632 4102 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526640 4102 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526649 4102 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526657 4102 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526665 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526673 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526680 4102 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526689 4102 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526696 4102 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526703 4102 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526712 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526719 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526727 4102 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526740 4102 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526750 4102 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526759 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526768 4102 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:19:15.529213 master-0 kubenswrapper[4102]: W0312 12:19:15.526776 4102 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526785 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526793 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526802 4102 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526811 4102 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526819 4102 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526828 4102 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526836 4102 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: W0312 12:19:15.526844 4102 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527037 4102 flags.go:64] FLAG: --address="0.0.0.0"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527059 4102 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527074 4102 flags.go:64] FLAG: --anonymous-auth="true"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527086 4102 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527098 4102 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527107 4102 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527119 4102 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527131 4102 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527141 4102 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527151 4102 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527161 4102 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527171 4102 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527193 4102 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 12 12:19:15.530223 master-0 kubenswrapper[4102]: I0312 12:19:15.527202 4102 flags.go:64] FLAG: --cgroup-root=""
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527211 4102 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527221 4102 flags.go:64] FLAG: --client-ca-file=""
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527230 4102 flags.go:64] FLAG: --cloud-config=""
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527238 4102 flags.go:64] FLAG: --cloud-provider=""
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527248 4102 flags.go:64] FLAG: --cluster-dns="[]"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527259 4102 flags.go:64] FLAG: --cluster-domain=""
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527268 4102 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527277 4102 flags.go:64] FLAG: --config-dir=""
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527286 4102 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527297 4102 flags.go:64] FLAG: --container-log-max-files="5"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527308 4102 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527317 4102 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527327 4102 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527337 4102 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527347 4102 flags.go:64] FLAG: --contention-profiling="false"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527356 4102 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527365 4102 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527377 4102 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527387 4102 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527398 4102 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527409 4102 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527418 4102 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527427 4102 flags.go:64] FLAG: --enable-load-reader="false"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527436 4102 flags.go:64] FLAG: --enable-server="true"
Mar 12 12:19:15.531352 master-0 kubenswrapper[4102]: I0312 12:19:15.527445 4102 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527458 4102 flags.go:64] FLAG: --event-burst="100"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527467 4102 flags.go:64] FLAG: --event-qps="50"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527500 4102 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527510 4102 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527520 4102 flags.go:64] FLAG: --eviction-hard=""
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527532 4102 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527541 4102 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527551 4102 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527562 4102 flags.go:64] FLAG: --eviction-soft=""
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527574 4102 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527585 4102 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527595 4102 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527604 4102 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527613 4102 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527622 4102 flags.go:64] FLAG: --fail-swap-on="true"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527631 4102 flags.go:64] FLAG: --feature-gates=""
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527643 4102 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527676 4102 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527686 4102 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527696 4102 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527706 4102 flags.go:64] FLAG: --healthz-port="10248"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527715 4102 flags.go:64] FLAG: --help="false"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527725 4102 flags.go:64] FLAG: --hostname-override=""
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527733 4102 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527743 4102 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 12 12:19:15.532625 master-0 kubenswrapper[4102]: I0312 12:19:15.527754 4102 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527763 4102 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527772 4102 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527782 4102 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527791 4102 flags.go:64] FLAG: --image-service-endpoint=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527800 4102 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527809 4102 flags.go:64] FLAG: --kube-api-burst="100"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527818 4102 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527829 4102 flags.go:64] FLAG: --kube-api-qps="50"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527839 4102 flags.go:64] FLAG: --kube-reserved=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527849 4102 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527858 4102 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527868 4102 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527877 4102 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527887 4102 flags.go:64] FLAG: --lock-file=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527896 4102 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527905 4102 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527916 4102 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527930 4102 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527940 4102 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527965 4102 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527974 4102 flags.go:64] FLAG: --logging-format="text"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527983 4102 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.527993 4102 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.528002 4102 flags.go:64] FLAG: --manifest-url=""
Mar 12 12:19:15.533824 master-0 kubenswrapper[4102]: I0312 12:19:15.528011 4102 flags.go:64] FLAG: --manifest-url-header=""
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528023 4102 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528032 4102 flags.go:64] FLAG: --max-open-files="1000000"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528044 4102 flags.go:64] FLAG: --max-pods="110"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528055 4102 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528064 4102 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528073 4102 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528083 4102 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528092 4102 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528101 4102 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528111 4102 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528132 4102 flags.go:64] FLAG: --node-status-max-images="50"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528141 4102 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528151 4102 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528160 4102 flags.go:64] FLAG: --pod-cidr=""
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528169 4102 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528183 4102 flags.go:64] FLAG: --pod-manifest-path=""
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528192 4102 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528201 4102 flags.go:64] FLAG: --pods-per-core="0"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528210 4102 flags.go:64] FLAG: --port="10250"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528219 4102 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528229 4102 flags.go:64] FLAG: --provider-id=""
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528238 4102 flags.go:64] FLAG: --qos-reserved=""
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528247 4102 flags.go:64] FLAG: --read-only-port="10255"
Mar 12 12:19:15.534924 master-0 kubenswrapper[4102]: I0312 12:19:15.528257 4102 flags.go:64] FLAG: --register-node="true"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528267 4102 flags.go:64] FLAG: --register-schedulable="true"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528276 4102 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528293 4102 flags.go:64] FLAG: --registry-burst="10"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528303 4102 flags.go:64] FLAG: --registry-qps="5"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528312 4102 flags.go:64] FLAG: --reserved-cpus=""
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528321 4102 flags.go:64] FLAG: --reserved-memory=""
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528334 4102 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528344 4102 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528353 4102 flags.go:64] FLAG: --rotate-certificates="false"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528363 4102 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528372 4102 flags.go:64] FLAG: --runonce="false"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528381 4102 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528390 4102 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528400 4102 flags.go:64] FLAG: --seccomp-default="false"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528409 4102 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528417 4102 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528427 4102 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528436 4102 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528446 4102 flags.go:64] FLAG: --storage-driver-password="root"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528455 4102 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528464 4102 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528473 4102 flags.go:64] FLAG: --storage-driver-user="root"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528517 4102 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528527 4102 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 12 12:19:15.536175 master-0 kubenswrapper[4102]: I0312 12:19:15.528536 4102 flags.go:64] FLAG: --system-cgroups=""
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528545 4102 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528560 4102 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528569 4102 flags.go:64] FLAG: --tls-cert-file=""
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528578 4102 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528590 4102 flags.go:64] FLAG: --tls-min-version=""
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528599 4102 flags.go:64] FLAG: --tls-private-key-file=""
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528611 4102 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528620 4102 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528630 4102 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528639 4102 flags.go:64] FLAG: --v="2"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528657 4102 flags.go:64] FLAG: --version="false"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528668 4102 flags.go:64] FLAG: --vmodule=""
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528679 4102 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: I0312 12:19:15.528689 4102 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528899 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528908 4102 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528917 4102 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528927 4102 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528936 4102 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528945 4102 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528952 4102 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528960 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 12:19:15.537701 master-0 kubenswrapper[4102]: W0312 12:19:15.528968 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.528976 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.528986 4102 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.528996 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529005 4102 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529015 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529023 4102 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529033 4102 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529043 4102 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529052 4102 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529061 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529069 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529078 4102 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529087 4102 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529096 4102 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529104 4102 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529113 4102 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529121 4102 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529138 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 12:19:15.539107 master-0 kubenswrapper[4102]: W0312 12:19:15.529146 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529155 4102 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529163 4102 
feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529172 4102 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529180 4102 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529188 4102 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529196 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529203 4102 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529214 4102 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529225 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529236 4102 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529246 4102 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529261 4102 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529269 4102 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529278 4102 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529286 4102 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529294 4102 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529302 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529310 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529318 4102 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 12:19:15.540343 master-0 kubenswrapper[4102]: W0312 12:19:15.529326 4102 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529333 4102 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: 
W0312 12:19:15.529341 4102 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529349 4102 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529421 4102 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529431 4102 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529440 4102 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529447 4102 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529455 4102 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529462 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529470 4102 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529507 4102 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529515 4102 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529522 4102 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529530 4102 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: 
W0312 12:19:15.529539 4102 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529548 4102 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529556 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529565 4102 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529572 4102 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 12:19:15.541654 master-0 kubenswrapper[4102]: W0312 12:19:15.529584 4102 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 12:19:15.542800 master-0 kubenswrapper[4102]: W0312 12:19:15.529592 4102 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 12:19:15.542800 master-0 kubenswrapper[4102]: W0312 12:19:15.529600 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 12:19:15.542800 master-0 kubenswrapper[4102]: W0312 12:19:15.529608 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 12:19:15.542800 master-0 kubenswrapper[4102]: W0312 12:19:15.529616 4102 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 12:19:15.542800 master-0 kubenswrapper[4102]: I0312 12:19:15.530446 4102 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false 
UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 12 12:19:15.544776 master-0 kubenswrapper[4102]: I0312 12:19:15.544685 4102 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 12 12:19:15.544776 master-0 kubenswrapper[4102]: I0312 12:19:15.544766 4102 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 12 12:19:15.544992 master-0 kubenswrapper[4102]: W0312 12:19:15.544944 4102 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 12:19:15.544992 master-0 kubenswrapper[4102]: W0312 12:19:15.544978 4102 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 12:19:15.544992 master-0 kubenswrapper[4102]: W0312 12:19:15.544987 4102 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 12:19:15.544992 master-0 kubenswrapper[4102]: W0312 12:19:15.544998 4102 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545009 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545019 4102 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545027 4102 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545037 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545045 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545054 4102 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545061 4102 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545069 4102 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545077 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545086 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545094 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545102 4102 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545110 4102 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545120 4102 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545134 4102 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545142 4102 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545150 4102 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545158 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545166 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 12:19:15.545205 master-0 kubenswrapper[4102]: W0312 12:19:15.545173 4102 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545182 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545190 4102 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545198 4102 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545206 4102 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545214 4102 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545224 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545234 4102 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 
12:19:15.545246 4102 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545256 4102 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545265 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545274 4102 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545283 4102 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545291 4102 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545299 4102 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545307 4102 feature_gate.go:330] unrecognized feature gate: Example Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545315 4102 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545323 4102 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545333 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545343 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 12:19:15.546349 master-0 kubenswrapper[4102]: W0312 12:19:15.545353 4102 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545367 4102 
feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545381 4102 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545394 4102 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545405 4102 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545419 4102 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545432 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545448 4102 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545459 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545469 4102 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545506 4102 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545515 4102 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545523 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545534 4102 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. 
It will be removed in a future release. Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545546 4102 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545554 4102 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545564 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545574 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545584 4102 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 12:19:15.547791 master-0 kubenswrapper[4102]: W0312 12:19:15.545594 4102 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545605 4102 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545616 4102 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545626 4102 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545640 4102 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545654 4102 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545664 4102 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545674 4102 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 12:19:15.549287 master-0 
kubenswrapper[4102]: W0312 12:19:15.545685 4102 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.545695 4102 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: I0312 12:19:15.545713 4102 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.548601 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.548644 4102 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.548656 4102 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.548669 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 12:19:15.549287 master-0 kubenswrapper[4102]: W0312 12:19:15.548679 4102 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548689 4102 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548699 4102 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 12:19:15.550040 master-0 
kubenswrapper[4102]: W0312 12:19:15.548709 4102 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548720 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548730 4102 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548740 4102 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548750 4102 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548760 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548769 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548779 4102 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548789 4102 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548799 4102 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548809 4102 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548820 4102 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548835 4102 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548849 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548859 4102 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548870 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548880 4102 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 12:19:15.550040 master-0 kubenswrapper[4102]: W0312 12:19:15.548891 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548902 4102 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548911 4102 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548921 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548930 4102 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548940 4102 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548950 4102 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548963 4102 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548977 4102 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.548987 4102 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549000 4102 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549010 4102 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549020 4102 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549030 4102 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549039 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549049 4102 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549059 4102 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549072 4102 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549086 4102 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 12:19:15.551043 master-0 kubenswrapper[4102]: W0312 12:19:15.549097 4102 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549107 4102 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549119 4102 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549130 4102 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549140 4102 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549152 4102 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549162 4102 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549171 4102 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549181 4102 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549237 4102 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549251 4102 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549262 4102 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549273 4102 
feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549290 4102 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549302 4102 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549314 4102 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549325 4102 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549336 4102 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549346 4102 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549357 4102 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 12:19:15.552107 master-0 kubenswrapper[4102]: W0312 12:19:15.549367 4102 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549381 4102 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549395 4102 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549406 4102 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549417 4102 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549428 4102 feature_gate.go:330] unrecognized feature gate: Example Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549438 4102 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549449 4102 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: W0312 12:19:15.549460 4102 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: I0312 12:19:15.549513 4102 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 12 12:19:15.553062 master-0 kubenswrapper[4102]: I0312 12:19:15.549920 4102 server.go:940] "Client rotation is on, will bootstrap in background" Mar 12 12:19:15.557193 master-0 kubenswrapper[4102]: I0312 12:19:15.557129 4102 bootstrap.go:101] "Use the bootstrap credentials to request a 
cert, and set kubeconfig to point to the certificate dir" Mar 12 12:19:15.558752 master-0 kubenswrapper[4102]: I0312 12:19:15.558700 4102 server.go:997] "Starting client certificate rotation" Mar 12 12:19:15.558835 master-0 kubenswrapper[4102]: I0312 12:19:15.558752 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 12 12:19:15.559024 master-0 kubenswrapper[4102]: I0312 12:19:15.558961 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 12:19:15.588317 master-0 kubenswrapper[4102]: I0312 12:19:15.588223 4102 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 12:19:15.592084 master-0 kubenswrapper[4102]: E0312 12:19:15.591985 4102 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:15.593305 master-0 kubenswrapper[4102]: I0312 12:19:15.593224 4102 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 12:19:15.622042 master-0 kubenswrapper[4102]: I0312 12:19:15.621958 4102 log.go:25] "Validated CRI v1 runtime API" Mar 12 12:19:15.628939 master-0 kubenswrapper[4102]: I0312 12:19:15.628866 4102 log.go:25] "Validated CRI v1 image API" Mar 12 12:19:15.631402 master-0 kubenswrapper[4102]: I0312 12:19:15.631359 4102 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 12 12:19:15.636230 master-0 kubenswrapper[4102]: I0312 12:19:15.636170 4102 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 
910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 f1cf7764-854b-4c2c-9df4-b92427278cd1:/dev/vda3] Mar 12 12:19:15.636230 master-0 kubenswrapper[4102]: I0312 12:19:15.636205 4102 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Mar 12 12:19:15.651542 master-0 kubenswrapper[4102]: I0312 12:19:15.651234 4102 manager.go:217] Machine: {Timestamp:2026-03-12 12:19:15.649321724 +0000 UTC m=+0.582098149 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:2fe2bf4a51c343709ee1f99d3a96e3ea SystemUUID:2fe2bf4a-51c3-4370-9ee1-f99d3a96e3ea BootID:473a10fb-d3cc-4f1f-a79c-240b5ee16b09 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 
252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:30:db:e2 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:e7:70:da Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:82:08:64:40:9a:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 
Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 12 12:19:15.651542 master-0 kubenswrapper[4102]: I0312 12:19:15.651470 4102 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 12 12:19:15.651827 master-0 kubenswrapper[4102]: I0312 12:19:15.651723 4102 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 12 12:19:15.653094 master-0 kubenswrapper[4102]: I0312 12:19:15.653014 4102 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 12 12:19:15.653247 master-0 kubenswrapper[4102]: I0312 12:19:15.653201 4102 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 12 12:19:15.653558 master-0 kubenswrapper[4102]: I0312 12:19:15.653236 4102 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentag
e":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 12:19:15.653558 master-0 kubenswrapper[4102]: I0312 12:19:15.653530 4102 topology_manager.go:138] "Creating topology manager with none policy" Mar 12 12:19:15.653558 master-0 kubenswrapper[4102]: I0312 12:19:15.653543 4102 container_manager_linux.go:303] "Creating device plugin manager" Mar 12 12:19:15.653558 master-0 kubenswrapper[4102]: I0312 12:19:15.653555 4102 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 12:19:15.653891 master-0 kubenswrapper[4102]: I0312 12:19:15.653579 4102 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 12:19:15.653891 master-0 kubenswrapper[4102]: I0312 12:19:15.653742 4102 state_mem.go:36] "Initialized new in-memory state store" Mar 12 12:19:15.653891 master-0 kubenswrapper[4102]: I0312 12:19:15.653823 4102 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 12 12:19:15.657576 master-0 kubenswrapper[4102]: I0312 12:19:15.657522 4102 kubelet.go:418] "Attempting to sync node with API server" Mar 12 12:19:15.657576 master-0 kubenswrapper[4102]: I0312 12:19:15.657548 4102 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 12:19:15.657735 master-0 kubenswrapper[4102]: I0312 12:19:15.657605 4102 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 12 12:19:15.657735 master-0 kubenswrapper[4102]: I0312 12:19:15.657622 4102 kubelet.go:324] "Adding apiserver pod source" Mar 12 12:19:15.657735 master-0 
kubenswrapper[4102]: I0312 12:19:15.657642 4102 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 12:19:15.661782 master-0 kubenswrapper[4102]: I0312 12:19:15.661716 4102 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 12 12:19:15.663099 master-0 kubenswrapper[4102]: W0312 12:19:15.663000 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:15.663237 master-0 kubenswrapper[4102]: W0312 12:19:15.663080 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:15.663308 master-0 kubenswrapper[4102]: E0312 12:19:15.663270 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:15.663388 master-0 kubenswrapper[4102]: E0312 12:19:15.663176 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:15.664277 master-0 kubenswrapper[4102]: I0312 12:19:15.664192 4102 
kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 12 12:19:15.664741 master-0 kubenswrapper[4102]: I0312 12:19:15.664670 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 12 12:19:15.664741 master-0 kubenswrapper[4102]: I0312 12:19:15.664712 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 12 12:19:15.664741 master-0 kubenswrapper[4102]: I0312 12:19:15.664727 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 12 12:19:15.664741 master-0 kubenswrapper[4102]: I0312 12:19:15.664742 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664757 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664771 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664786 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664801 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664837 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664851 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664871 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 12 12:19:15.664986 master-0 kubenswrapper[4102]: I0312 12:19:15.664894 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 12 12:19:15.668133 master-0 
kubenswrapper[4102]: I0312 12:19:15.668030 4102 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 12 12:19:15.668975 master-0 kubenswrapper[4102]: I0312 12:19:15.668933 4102 server.go:1280] "Started kubelet" Mar 12 12:19:15.671068 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 12 12:19:15.672089 master-0 kubenswrapper[4102]: I0312 12:19:15.671140 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:15.672089 master-0 kubenswrapper[4102]: I0312 12:19:15.671780 4102 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 12:19:15.672089 master-0 kubenswrapper[4102]: I0312 12:19:15.671790 4102 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 12:19:15.672089 master-0 kubenswrapper[4102]: I0312 12:19:15.672035 4102 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 12 12:19:15.672842 master-0 kubenswrapper[4102]: I0312 12:19:15.672790 4102 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 12:19:15.674615 master-0 kubenswrapper[4102]: I0312 12:19:15.674561 4102 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 12 12:19:15.674615 master-0 kubenswrapper[4102]: I0312 12:19:15.674616 4102 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 12 12:19:15.674997 master-0 kubenswrapper[4102]: I0312 12:19:15.674893 4102 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 12 12:19:15.674997 master-0 kubenswrapper[4102]: I0312 12:19:15.674970 4102 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 12 12:19:15.675187 master-0 kubenswrapper[4102]: I0312 12:19:15.675042 4102 
desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 12 12:19:15.675187 master-0 kubenswrapper[4102]: E0312 12:19:15.674926 4102 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 12 12:19:15.675333 master-0 kubenswrapper[4102]: I0312 12:19:15.675257 4102 reconstruct.go:97] "Volume reconstruction finished" Mar 12 12:19:15.675333 master-0 kubenswrapper[4102]: I0312 12:19:15.675266 4102 reconciler.go:26] "Reconciler: start to sync state" Mar 12 12:19:15.676099 master-0 kubenswrapper[4102]: I0312 12:19:15.676031 4102 factory.go:55] Registering systemd factory Mar 12 12:19:15.676099 master-0 kubenswrapper[4102]: I0312 12:19:15.676079 4102 factory.go:221] Registration of the systemd container factory successfully Mar 12 12:19:15.676373 master-0 kubenswrapper[4102]: E0312 12:19:15.676257 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 12 12:19:15.677227 master-0 kubenswrapper[4102]: I0312 12:19:15.676568 4102 factory.go:153] Registering CRI-O factory Mar 12 12:19:15.677227 master-0 kubenswrapper[4102]: I0312 12:19:15.676604 4102 factory.go:221] Registration of the crio container factory successfully Mar 12 12:19:15.677227 master-0 kubenswrapper[4102]: I0312 12:19:15.676704 4102 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 12 12:19:15.677227 master-0 kubenswrapper[4102]: I0312 12:19:15.676735 4102 factory.go:103] Registering Raw factory Mar 12 12:19:15.677227 master-0 kubenswrapper[4102]: I0312 12:19:15.676752 4102 manager.go:1196] Started watching for new 
ooms in manager Mar 12 12:19:15.678147 master-0 kubenswrapper[4102]: W0312 12:19:15.676957 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:15.683202 master-0 kubenswrapper[4102]: E0312 12:19:15.683124 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:15.683202 master-0 kubenswrapper[4102]: E0312 12:19:15.676572 4102 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189c174905772d50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,LastTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:15.684599 master-0 kubenswrapper[4102]: I0312 12:19:15.684564 4102 manager.go:319] Starting recovery of all containers Mar 12 12:19:15.685741 master-0 kubenswrapper[4102]: I0312 12:19:15.684769 4102 server.go:449] "Adding debug handlers to kubelet server" Mar 12 12:19:15.687114 master-0 kubenswrapper[4102]: 
E0312 12:19:15.687054 4102 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 12 12:19:15.717885 master-0 kubenswrapper[4102]: I0312 12:19:15.717844 4102 manager.go:324] Recovery completed Mar 12 12:19:15.735331 master-0 kubenswrapper[4102]: I0312 12:19:15.735304 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:15.738103 master-0 kubenswrapper[4102]: I0312 12:19:15.738067 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:15.738331 master-0 kubenswrapper[4102]: I0312 12:19:15.738316 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:15.738505 master-0 kubenswrapper[4102]: I0312 12:19:15.738460 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:15.739688 master-0 kubenswrapper[4102]: I0312 12:19:15.739668 4102 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 12 12:19:15.739783 master-0 kubenswrapper[4102]: I0312 12:19:15.739768 4102 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 12 12:19:15.739858 master-0 kubenswrapper[4102]: I0312 12:19:15.739846 4102 state_mem.go:36] "Initialized new in-memory state store" Mar 12 12:19:15.742739 master-0 kubenswrapper[4102]: I0312 12:19:15.742723 4102 policy_none.go:49] "None policy: Start" Mar 12 12:19:15.743937 master-0 kubenswrapper[4102]: I0312 12:19:15.743852 4102 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 12 12:19:15.743937 master-0 kubenswrapper[4102]: I0312 12:19:15.743885 4102 state_mem.go:35] "Initializing new in-memory state store" Mar 12 12:19:15.776655 master-0 kubenswrapper[4102]: E0312 12:19:15.776215 4102 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"master-0\" not found" Mar 12 12:19:15.817640 master-0 kubenswrapper[4102]: I0312 12:19:15.817442 4102 manager.go:334] "Starting Device Plugin manager" Mar 12 12:19:15.817640 master-0 kubenswrapper[4102]: I0312 12:19:15.817529 4102 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 12 12:19:15.817640 master-0 kubenswrapper[4102]: I0312 12:19:15.817544 4102 server.go:79] "Starting device plugin registration server" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.818054 4102 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.818069 4102 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: E0312 12:19:15.821148 4102 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.821174 4102 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.821314 4102 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.821323 4102 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.835951 4102 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.838327 4102 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6"
Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.838381 4102 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: I0312 12:19:15.838411 4102 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: E0312 12:19:15.838609 4102 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: W0312 12:19:15.839614 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:15.848326 master-0 kubenswrapper[4102]: E0312 12:19:15.839679 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 12 12:19:15.877386 master-0 kubenswrapper[4102]: E0312 12:19:15.877326 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 12 12:19:15.918754 master-0 kubenswrapper[4102]: I0312 12:19:15.918684 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.920410 master-0 kubenswrapper[4102]: I0312 12:19:15.920339 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.920410 master-0 kubenswrapper[4102]: I0312 12:19:15.920399 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.920697 master-0 kubenswrapper[4102]: I0312 12:19:15.920423 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.920697 master-0 kubenswrapper[4102]: I0312 12:19:15.920509 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:15.922023 master-0 kubenswrapper[4102]: E0312 12:19:15.921964 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 12 12:19:15.939695 master-0 kubenswrapper[4102]: I0312 12:19:15.939603 4102 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 12 12:19:15.939858 master-0 kubenswrapper[4102]: I0312 12:19:15.939717 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.940941 master-0 kubenswrapper[4102]: I0312 12:19:15.940827 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.940941 master-0 kubenswrapper[4102]: I0312 12:19:15.940877 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.940941 master-0 kubenswrapper[4102]: I0312 12:19:15.940894 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.942053 master-0 kubenswrapper[4102]: I0312 12:19:15.941040 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.942053 master-0 kubenswrapper[4102]: I0312 12:19:15.941324 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:15.942053 master-0 kubenswrapper[4102]: I0312 12:19:15.941392 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.942452 master-0 kubenswrapper[4102]: I0312 12:19:15.942381 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.942452 master-0 kubenswrapper[4102]: I0312 12:19:15.942446 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.942647 master-0 kubenswrapper[4102]: I0312 12:19:15.942470 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.942647 master-0 kubenswrapper[4102]: I0312 12:19:15.942640 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.942764 master-0 kubenswrapper[4102]: I0312 12:19:15.942657 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.942764 master-0 kubenswrapper[4102]: I0312 12:19:15.942669 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.942877 master-0 kubenswrapper[4102]: I0312 12:19:15.942828 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.943104 master-0 kubenswrapper[4102]: I0312 12:19:15.942923 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.943438 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.944657 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.944701 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.944739 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.944995 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.945026 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.945027 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.945046 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.945592 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:15.946192 master-0 kubenswrapper[4102]: I0312 12:19:15.945716 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.947920 master-0 kubenswrapper[4102]: I0312 12:19:15.946310 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.947920 master-0 kubenswrapper[4102]: I0312 12:19:15.946437 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.947920 master-0 kubenswrapper[4102]: I0312 12:19:15.946467 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.947920 master-0 kubenswrapper[4102]: I0312 12:19:15.946927 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.947920 master-0 kubenswrapper[4102]: I0312 12:19:15.947139 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:15.947920 master-0 kubenswrapper[4102]: I0312 12:19:15.947222 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.947920 master-0 kubenswrapper[4102]: I0312 12:19:15.947510 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.948820 master-0 kubenswrapper[4102]: I0312 12:19:15.948425 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.948820 master-0 kubenswrapper[4102]: I0312 12:19:15.948470 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.949130 master-0 kubenswrapper[4102]: I0312 12:19:15.949075 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.949130 master-0 kubenswrapper[4102]: I0312 12:19:15.949119 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.949130 master-0 kubenswrapper[4102]: I0312 12:19:15.949132 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.949364 master-0 kubenswrapper[4102]: I0312 12:19:15.949334 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:15.949456 master-0 kubenswrapper[4102]: I0312 12:19:15.949376 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:15.949456 master-0 kubenswrapper[4102]: I0312 12:19:15.949424 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.949613 master-0 kubenswrapper[4102]: I0312 12:19:15.949550 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.949613 master-0 kubenswrapper[4102]: I0312 12:19:15.949583 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.950145 master-0 kubenswrapper[4102]: I0312 12:19:15.950096 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:15.950247 master-0 kubenswrapper[4102]: I0312 12:19:15.950140 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:15.950342 master-0 kubenswrapper[4102]: I0312 12:19:15.950282 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:15.976754 master-0 kubenswrapper[4102]: I0312 12:19:15.976676 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:15.976754 master-0 kubenswrapper[4102]: I0312 12:19:15.976721 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:15.976985 master-0 kubenswrapper[4102]: I0312 12:19:15.976829 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:15.976985 master-0 kubenswrapper[4102]: I0312 12:19:15.976899 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:15.977118 master-0 kubenswrapper[4102]: I0312 12:19:15.977010 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:15.977118 master-0 kubenswrapper[4102]: I0312 12:19:15.977085 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:15.977118 master-0 kubenswrapper[4102]: I0312 12:19:15.977114 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:15.977293 master-0 kubenswrapper[4102]: I0312 12:19:15.977140 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:15.977293 master-0 kubenswrapper[4102]: I0312 12:19:15.977168 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:15.977293 master-0 kubenswrapper[4102]: I0312 12:19:15.977198 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:15.977293 master-0 kubenswrapper[4102]: I0312 12:19:15.977246 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:15.977293 master-0 kubenswrapper[4102]: I0312 12:19:15.977277 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:15.977734 master-0 kubenswrapper[4102]: I0312 12:19:15.977304 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:15.977734 master-0 kubenswrapper[4102]: I0312 12:19:15.977336 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:15.977734 master-0 kubenswrapper[4102]: I0312 12:19:15.977368 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:15.977734 master-0 kubenswrapper[4102]: I0312 12:19:15.977396 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:15.977734 master-0 kubenswrapper[4102]: I0312 12:19:15.977426 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.078638 master-0 kubenswrapper[4102]: I0312 12:19:16.078521 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.078638 master-0 kubenswrapper[4102]: I0312 12:19:16.078579 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:16.078638 master-0 kubenswrapper[4102]: I0312 12:19:16.078608 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:16.078638 master-0 kubenswrapper[4102]: I0312 12:19:16.078629 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.078638 master-0 kubenswrapper[4102]: I0312 12:19:16.078648 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.079233 master-0 kubenswrapper[4102]: I0312 12:19:16.078669 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.079233 master-0 kubenswrapper[4102]: I0312 12:19:16.078860 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.079233 master-0 kubenswrapper[4102]: I0312 12:19:16.079009 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.079233 master-0 kubenswrapper[4102]: I0312 12:19:16.079087 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:16.079233 master-0 kubenswrapper[4102]: I0312 12:19:16.079186 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.079233 master-0 kubenswrapper[4102]: I0312 12:19:16.079237 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:16.079650 master-0 kubenswrapper[4102]: I0312 12:19:16.079297 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.079650 master-0 kubenswrapper[4102]: I0312 12:19:16.079349 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:16.079650 master-0 kubenswrapper[4102]: I0312 12:19:16.079309 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.079650 master-0 kubenswrapper[4102]: I0312 12:19:16.079381 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:16.079650 master-0 kubenswrapper[4102]: I0312 12:19:16.079458 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:16.079650 master-0 kubenswrapper[4102]: I0312 12:19:16.079586 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:16.079650 master-0 kubenswrapper[4102]: I0312 12:19:16.079638 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079667 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079698 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079713 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079737 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079748 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079772 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079805 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079830 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079843 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079885 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079913 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079940 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079916 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079993 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.079919 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.080105 master-0 kubenswrapper[4102]: I0312 12:19:16.080044 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.123019 master-0 kubenswrapper[4102]: I0312 12:19:16.122934 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:16.124426 master-0 kubenswrapper[4102]: I0312 12:19:16.124244 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:16.124426 master-0 kubenswrapper[4102]: I0312 12:19:16.124306 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:16.124426 master-0 kubenswrapper[4102]: I0312 12:19:16.124332 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:16.124717 master-0 kubenswrapper[4102]: I0312 12:19:16.124473 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:16.125862 master-0 kubenswrapper[4102]: E0312 12:19:16.125790 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 12 12:19:16.279735 master-0 kubenswrapper[4102]: E0312 12:19:16.279649 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 12 12:19:16.282014 master-0 kubenswrapper[4102]: I0312 12:19:16.281962 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:19:16.304563 master-0 kubenswrapper[4102]: I0312 12:19:16.304473 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:19:16.319606 master-0 kubenswrapper[4102]: I0312 12:19:16.319550 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:16.340538 master-0 kubenswrapper[4102]: I0312 12:19:16.340426 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:16.349437 master-0 kubenswrapper[4102]: I0312 12:19:16.349390 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:19:16.526921 master-0 kubenswrapper[4102]: I0312 12:19:16.526851 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:16.528334 master-0 kubenswrapper[4102]: I0312 12:19:16.528280 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:16.528334 master-0 kubenswrapper[4102]: I0312 12:19:16.528325 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:16.528510 master-0 kubenswrapper[4102]: I0312 12:19:16.528342 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:16.528510 master-0 kubenswrapper[4102]: I0312 12:19:16.528404 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:16.529512 master-0 kubenswrapper[4102]: E0312 12:19:16.529424 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 12 12:19:16.673471 master-0 kubenswrapper[4102]: I0312 12:19:16.673292 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:16.760102 master-0 kubenswrapper[4102]: W0312 12:19:16.759948 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:16.760102 master-0 kubenswrapper[4102]: E0312 12:19:16.760106 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 12 12:19:17.035866 master-0 kubenswrapper[4102]: W0312 12:19:17.035755 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9add8df47182fc2eaf8cd78016ebe72.slice/crio-83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe WatchSource:0}: Error finding container 83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe: Status 404 returned error can't find the container with id 83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe
Mar 12 12:19:17.041160 master-0 kubenswrapper[4102]: W0312 12:19:17.040706 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:17.041160 master-0 kubenswrapper[4102]: I0312 12:19:17.040793 4102 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 12:19:17.041160 master-0 kubenswrapper[4102]: E0312 12:19:17.040826 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 12 12:19:17.050339 master-0 kubenswrapper[4102]: W0312 12:19:17.050246 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f29997baa583b6238f7de9108ee10.slice/crio-40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3 WatchSource:0}: Error finding container 40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3: Status 404 returned error can't find the container with id 40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3
Mar 12 12:19:17.081222 master-0 kubenswrapper[4102]: E0312 12:19:17.081124 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 12 12:19:17.113747 master-0 kubenswrapper[4102]: W0312 12:19:17.113686 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec
WatchSource:0}: Error finding container 7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec: Status 404 returned error can't find the container with id 7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec Mar 12 12:19:17.186104 master-0 kubenswrapper[4102]: W0312 12:19:17.186043 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3 WatchSource:0}: Error finding container 71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3: Status 404 returned error can't find the container with id 71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3 Mar 12 12:19:17.194344 master-0 kubenswrapper[4102]: W0312 12:19:17.194251 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:17.194452 master-0 kubenswrapper[4102]: E0312 12:19:17.194354 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:17.235508 master-0 kubenswrapper[4102]: W0312 12:19:17.235337 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c05e1499b533b83f091333d61f045.slice/crio-954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320 WatchSource:0}: Error finding container 954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320: Status 404 
returned error can't find the container with id 954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320 Mar 12 12:19:17.300396 master-0 kubenswrapper[4102]: W0312 12:19:17.300176 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:17.300396 master-0 kubenswrapper[4102]: E0312 12:19:17.300281 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:17.329901 master-0 kubenswrapper[4102]: I0312 12:19:17.329789 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:17.331512 master-0 kubenswrapper[4102]: I0312 12:19:17.331421 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:17.331512 master-0 kubenswrapper[4102]: I0312 12:19:17.331463 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:17.331734 master-0 kubenswrapper[4102]: I0312 12:19:17.331520 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:17.331734 master-0 kubenswrapper[4102]: I0312 12:19:17.331623 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 12 12:19:17.332688 master-0 kubenswrapper[4102]: E0312 12:19:17.332629 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 
192.168.32.10:6443: connect: connection refused" node="master-0" Mar 12 12:19:17.629282 master-0 kubenswrapper[4102]: I0312 12:19:17.629111 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 12:19:17.631196 master-0 kubenswrapper[4102]: E0312 12:19:17.631133 4102 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:17.643956 master-0 kubenswrapper[4102]: E0312 12:19:17.643724 4102 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189c174905772d50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,LastTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:17.673430 master-0 kubenswrapper[4102]: I0312 12:19:17.673320 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:17.845112 master-0 kubenswrapper[4102]: 
I0312 12:19:17.845021 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec"} Mar 12 12:19:17.846037 master-0 kubenswrapper[4102]: I0312 12:19:17.845987 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3"} Mar 12 12:19:17.847128 master-0 kubenswrapper[4102]: I0312 12:19:17.847076 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe"} Mar 12 12:19:17.848088 master-0 kubenswrapper[4102]: I0312 12:19:17.848039 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320"} Mar 12 12:19:17.849300 master-0 kubenswrapper[4102]: I0312 12:19:17.849254 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3"} Mar 12 12:19:18.438232 master-0 kubenswrapper[4102]: W0312 12:19:18.438178 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:18.438232 master-0 
kubenswrapper[4102]: E0312 12:19:18.438235 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:18.672629 master-0 kubenswrapper[4102]: I0312 12:19:18.672576 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:18.682625 master-0 kubenswrapper[4102]: E0312 12:19:18.682584 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 12 12:19:18.859084 master-0 kubenswrapper[4102]: W0312 12:19:18.858974 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:18.859084 master-0 kubenswrapper[4102]: E0312 12:19:18.859029 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:18.933125 master-0 kubenswrapper[4102]: I0312 12:19:18.933048 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 12 12:19:18.934549 master-0 kubenswrapper[4102]: I0312 12:19:18.934458 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:18.934652 master-0 kubenswrapper[4102]: I0312 12:19:18.934560 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:18.934652 master-0 kubenswrapper[4102]: I0312 12:19:18.934570 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:18.934652 master-0 kubenswrapper[4102]: I0312 12:19:18.934622 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 12 12:19:18.935519 master-0 kubenswrapper[4102]: E0312 12:19:18.935453 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 12 12:19:19.440555 master-0 kubenswrapper[4102]: W0312 12:19:19.440505 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:19.440738 master-0 kubenswrapper[4102]: E0312 12:19:19.440575 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:19.672401 master-0 kubenswrapper[4102]: I0312 12:19:19.672341 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:19.864454 master-0 kubenswrapper[4102]: W0312 12:19:19.864326 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:19.864454 master-0 kubenswrapper[4102]: E0312 12:19:19.864417 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:20.947789 master-0 kubenswrapper[4102]: I0312 12:19:20.947499 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:20.951989 master-0 kubenswrapper[4102]: I0312 12:19:20.951803 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6"} Mar 12 12:19:20.958372 master-0 kubenswrapper[4102]: I0312 12:19:20.958320 4102 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="db3181f1b8f0872f0e0be3a238121d5924cf03894cfb4adac07406dd2be14404" exitCode=0 Mar 12 12:19:20.958372 master-0 kubenswrapper[4102]: I0312 12:19:20.958372 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"db3181f1b8f0872f0e0be3a238121d5924cf03894cfb4adac07406dd2be14404"} Mar 12 12:19:20.958551 master-0 kubenswrapper[4102]: I0312 12:19:20.958492 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:20.960891 master-0 kubenswrapper[4102]: I0312 12:19:20.960797 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:20.961024 master-0 kubenswrapper[4102]: I0312 12:19:20.960954 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:20.961176 master-0 kubenswrapper[4102]: I0312 12:19:20.961141 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:21.641449 master-0 kubenswrapper[4102]: I0312 12:19:21.641226 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 12 12:19:21.643816 master-0 kubenswrapper[4102]: E0312 12:19:21.643761 4102 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:21.672540 master-0 kubenswrapper[4102]: I0312 12:19:21.672450 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:21.913176 master-0 kubenswrapper[4102]: E0312 
12:19:21.912977 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 12 12:19:21.963796 master-0 kubenswrapper[4102]: I0312 12:19:21.963725 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714"} Mar 12 12:19:21.963796 master-0 kubenswrapper[4102]: I0312 12:19:21.963780 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:21.964726 master-0 kubenswrapper[4102]: I0312 12:19:21.964690 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:21.964726 master-0 kubenswrapper[4102]: I0312 12:19:21.964719 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:21.964726 master-0 kubenswrapper[4102]: I0312 12:19:21.964726 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:21.965301 master-0 kubenswrapper[4102]: I0312 12:19:21.965263 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 12 12:19:21.965693 master-0 kubenswrapper[4102]: I0312 12:19:21.965656 4102 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="36ab76b564072b6646c2b942849822493968776ee5f073af67206d3fce46e2ee" exitCode=1 Mar 12 12:19:21.965693 master-0 kubenswrapper[4102]: I0312 12:19:21.965687 4102 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"36ab76b564072b6646c2b942849822493968776ee5f073af67206d3fce46e2ee"} Mar 12 12:19:21.965793 master-0 kubenswrapper[4102]: I0312 12:19:21.965745 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:21.966342 master-0 kubenswrapper[4102]: I0312 12:19:21.966308 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:21.966342 master-0 kubenswrapper[4102]: I0312 12:19:21.966338 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:21.966435 master-0 kubenswrapper[4102]: I0312 12:19:21.966348 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:21.966564 master-0 kubenswrapper[4102]: I0312 12:19:21.966536 4102 scope.go:117] "RemoveContainer" containerID="36ab76b564072b6646c2b942849822493968776ee5f073af67206d3fce46e2ee" Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: I0312 12:19:22.135715 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: I0312 12:19:22.136597 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: I0312 12:19:22.136615 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: I0312 12:19:22.136623 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: I0312 12:19:22.136661 4102 
kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: E0312 12:19:22.137229 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: W0312 12:19:22.144827 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:22.272605 master-0 kubenswrapper[4102]: E0312 12:19:22.144879 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:22.672540 master-0 kubenswrapper[4102]: I0312 12:19:22.672464 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:22.969470 master-0 kubenswrapper[4102]: I0312 12:19:22.969349 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 12 12:19:22.970135 master-0 kubenswrapper[4102]: I0312 12:19:22.970080 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 12 12:19:22.971175 
master-0 kubenswrapper[4102]: I0312 12:19:22.970337 4102 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="db23106411152aa66583975d4e7af811e346c60413a32e2811259c380fc87177" exitCode=1 Mar 12 12:19:22.971175 master-0 kubenswrapper[4102]: I0312 12:19:22.970416 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:22.971175 master-0 kubenswrapper[4102]: I0312 12:19:22.970845 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:22.971175 master-0 kubenswrapper[4102]: I0312 12:19:22.971059 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"db23106411152aa66583975d4e7af811e346c60413a32e2811259c380fc87177"} Mar 12 12:19:22.971175 master-0 kubenswrapper[4102]: I0312 12:19:22.971095 4102 scope.go:117] "RemoveContainer" containerID="36ab76b564072b6646c2b942849822493968776ee5f073af67206d3fce46e2ee" Mar 12 12:19:22.971970 master-0 kubenswrapper[4102]: I0312 12:19:22.971502 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:22.971970 master-0 kubenswrapper[4102]: I0312 12:19:22.971532 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:22.971970 master-0 kubenswrapper[4102]: I0312 12:19:22.971543 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:22.971970 master-0 kubenswrapper[4102]: I0312 12:19:22.971813 4102 scope.go:117] "RemoveContainer" containerID="db23106411152aa66583975d4e7af811e346c60413a32e2811259c380fc87177" Mar 12 12:19:22.972247 master-0 kubenswrapper[4102]: E0312 12:19:22.972017 4102 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 12 12:19:22.972247 master-0 kubenswrapper[4102]: I0312 12:19:22.972038 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:22.972247 master-0 kubenswrapper[4102]: I0312 12:19:22.972085 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:22.972247 master-0 kubenswrapper[4102]: I0312 12:19:22.972093 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:23.466926 master-0 kubenswrapper[4102]: W0312 12:19:23.466867 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:23.467075 master-0 kubenswrapper[4102]: E0312 12:19:23.466931 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 12 12:19:23.672898 master-0 kubenswrapper[4102]: I0312 12:19:23.672825 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 12 12:19:23.974980 master-0 kubenswrapper[4102]: I0312 12:19:23.974915 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 12 12:19:23.976414 master-0 kubenswrapper[4102]: I0312 12:19:23.976353 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 12 12:19:23.977275 master-0 kubenswrapper[4102]: I0312 12:19:23.977226 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 12 12:19:23.977347 master-0 kubenswrapper[4102]: I0312 12:19:23.977281 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 12 12:19:23.977347 master-0 kubenswrapper[4102]: I0312 12:19:23.977298 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:19:23.977804 master-0 kubenswrapper[4102]: I0312 12:19:23.977766 4102 scope.go:117] "RemoveContainer" containerID="db23106411152aa66583975d4e7af811e346c60413a32e2811259c380fc87177" Mar 12 12:19:23.978030 master-0 kubenswrapper[4102]: E0312 12:19:23.977985 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 12 12:19:24.672737 master-0 kubenswrapper[4102]: I0312 12:19:24.672685 4102 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:24.888442 master-0 kubenswrapper[4102]: W0312 12:19:24.888291 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:24.888658 master-0 kubenswrapper[4102]: E0312 12:19:24.888435 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 12 12:19:25.229519 master-0 kubenswrapper[4102]: W0312 12:19:25.229376 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:25.229519 master-0 kubenswrapper[4102]: E0312 12:19:25.229462 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 12 12:19:25.673462 master-0 kubenswrapper[4102]: I0312 12:19:25.673097 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:25.821299 master-0 kubenswrapper[4102]: E0312 12:19:25.821230 4102 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 12 12:19:26.672920 master-0 kubenswrapper[4102]: I0312 12:19:26.672871 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:27.645646 master-0 kubenswrapper[4102]: E0312 12:19:27.645406 4102 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189c174905772d50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,LastTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:27.769323 master-0 kubenswrapper[4102]: I0312 12:19:27.718437 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:28.313966 master-0 kubenswrapper[4102]: E0312 12:19:28.313897 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="7s"
Mar 12 12:19:28.537703 master-0 kubenswrapper[4102]: I0312 12:19:28.537613 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:28.538538 master-0 kubenswrapper[4102]: I0312 12:19:28.538503 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:28.538598 master-0 kubenswrapper[4102]: I0312 12:19:28.538544 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:28.538598 master-0 kubenswrapper[4102]: I0312 12:19:28.538552 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:28.538598 master-0 kubenswrapper[4102]: I0312 12:19:28.538586 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:28.539240 master-0 kubenswrapper[4102]: E0312 12:19:28.539203 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 12 12:19:28.673656 master-0 kubenswrapper[4102]: I0312 12:19:28.673504 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:29.672374 master-0 kubenswrapper[4102]: I0312 12:19:29.672258 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:29.868932 master-0 kubenswrapper[4102]: W0312 12:19:29.868814 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 12 12:19:29.868932 master-0 kubenswrapper[4102]: E0312 12:19:29.868928 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 12 12:19:29.989775 master-0 kubenswrapper[4102]: I0312 12:19:29.989734 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:29.989957 master-0 kubenswrapper[4102]: I0312 12:19:29.989765 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d"}
Mar 12 12:19:29.990598 master-0 kubenswrapper[4102]: I0312 12:19:29.990562 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:29.990673 master-0 kubenswrapper[4102]: I0312 12:19:29.990616 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:29.990673 master-0 kubenswrapper[4102]: I0312 12:19:29.990636 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:29.991717 master-0 kubenswrapper[4102]: I0312 12:19:29.991668 4102 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03" exitCode=0
Mar 12 12:19:29.991822 master-0 kubenswrapper[4102]: I0312 12:19:29.991787 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03"}
Mar 12 12:19:29.991857 master-0 kubenswrapper[4102]: I0312 12:19:29.991800 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:29.992702 master-0 kubenswrapper[4102]: I0312 12:19:29.992667 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:29.992752 master-0 kubenswrapper[4102]: I0312 12:19:29.992714 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:29.992752 master-0 kubenswrapper[4102]: I0312 12:19:29.992732 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:29.993470 master-0 kubenswrapper[4102]: I0312 12:19:29.993411 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"7832f5ffa2c5ed0b1534228e4894dd1d7b32cd0726e9bdedd6ffb73456947fa0"}
Mar 12 12:19:29.998100 master-0 kubenswrapper[4102]: I0312 12:19:29.998067 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:29.998989 master-0 kubenswrapper[4102]: I0312 12:19:29.998957 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:29.999019 master-0 kubenswrapper[4102]: I0312 12:19:29.999001 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:29.999094 master-0 kubenswrapper[4102]: I0312 12:19:29.999019 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:30.149641 master-0 kubenswrapper[4102]: I0312 12:19:30.149551 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 12 12:19:30.150943 master-0 kubenswrapper[4102]: E0312 12:19:30.150907 4102 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 12 12:19:31.000902 master-0 kubenswrapper[4102]: I0312 12:19:31.000847 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:31.001405 master-0 kubenswrapper[4102]: I0312 12:19:31.000902 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12"}
Mar 12 12:19:31.001555 master-0 kubenswrapper[4102]: I0312 12:19:31.001525 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:31.001596 master-0 kubenswrapper[4102]: I0312 12:19:31.001577 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:31.001596 master-0 kubenswrapper[4102]: I0312 12:19:31.001587 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:31.596187 master-0 kubenswrapper[4102]: I0312 12:19:31.596144 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:31.676894 master-0 kubenswrapper[4102]: I0312 12:19:31.676828 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:31.696296 master-0 kubenswrapper[4102]: W0312 12:19:31.696254 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 12 12:19:31.696545 master-0 kubenswrapper[4102]: E0312 12:19:31.696313 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 12 12:19:32.677585 master-0 kubenswrapper[4102]: I0312 12:19:32.677536 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:33.007816 master-0 kubenswrapper[4102]: I0312 12:19:33.007748 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"96e48ce640071ce1e3c5822f8f356a843319ddfef771dec6cce93f02b2946dac"}
Mar 12 12:19:33.008015 master-0 kubenswrapper[4102]: I0312 12:19:33.007913 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:33.009597 master-0 kubenswrapper[4102]: I0312 12:19:33.009094 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:33.009597 master-0 kubenswrapper[4102]: I0312 12:19:33.009116 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:33.009597 master-0 kubenswrapper[4102]: I0312 12:19:33.009125 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:33.010871 master-0 kubenswrapper[4102]: I0312 12:19:33.010802 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f"}
Mar 12 12:19:33.010923 master-0 kubenswrapper[4102]: I0312 12:19:33.010888 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:33.012001 master-0 kubenswrapper[4102]: I0312 12:19:33.011737 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:33.012001 master-0 kubenswrapper[4102]: I0312 12:19:33.011758 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:33.012001 master-0 kubenswrapper[4102]: I0312 12:19:33.011766 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:33.114153 master-0 kubenswrapper[4102]: W0312 12:19:33.114028 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 12 12:19:33.114153 master-0 kubenswrapper[4102]: E0312 12:19:33.114109 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 12 12:19:33.149856 master-0 kubenswrapper[4102]: I0312 12:19:33.149817 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:33.679891 master-0 kubenswrapper[4102]: I0312 12:19:33.679833 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:33.997414 master-0 kubenswrapper[4102]: W0312 12:19:33.997354 4102 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 12 12:19:33.997684 master-0 kubenswrapper[4102]: E0312 12:19:33.997435 4102 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 12 12:19:34.013454 master-0 kubenswrapper[4102]: I0312 12:19:34.013405 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:34.013699 master-0 kubenswrapper[4102]: I0312 12:19:34.013421 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:34.014904 master-0 kubenswrapper[4102]: I0312 12:19:34.014843 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:34.014989 master-0 kubenswrapper[4102]: I0312 12:19:34.014912 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:34.014989 master-0 kubenswrapper[4102]: I0312 12:19:34.014934 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:34.015128 master-0 kubenswrapper[4102]: I0312 12:19:34.015078 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:34.015174 master-0 kubenswrapper[4102]: I0312 12:19:34.015138 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:34.015262 master-0 kubenswrapper[4102]: I0312 12:19:34.015230 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:34.497617 master-0 kubenswrapper[4102]: I0312 12:19:34.497538 4102 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:34.504465 master-0 kubenswrapper[4102]: I0312 12:19:34.504413 4102 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:34.679245 master-0 kubenswrapper[4102]: I0312 12:19:34.679164 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:34.833090 master-0 kubenswrapper[4102]: I0312 12:19:34.832923 4102 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:34.839996 master-0 kubenswrapper[4102]: I0312 12:19:34.839905 4102 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:35.015681 master-0 kubenswrapper[4102]: I0312 12:19:35.015641 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:35.016110 master-0 kubenswrapper[4102]: I0312 12:19:35.015714 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:35.016201 master-0 kubenswrapper[4102]: I0312 12:19:35.015745 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:35.017457 master-0 kubenswrapper[4102]: I0312 12:19:35.017433 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:35.017649 master-0 kubenswrapper[4102]: I0312 12:19:35.017627 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:35.017781 master-0 kubenswrapper[4102]: I0312 12:19:35.017761 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:35.017935 master-0 kubenswrapper[4102]: I0312 12:19:35.017571 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:35.018010 master-0 kubenswrapper[4102]: I0312 12:19:35.017943 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:35.018010 master-0 kubenswrapper[4102]: I0312 12:19:35.017962 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:35.020731 master-0 kubenswrapper[4102]: I0312 12:19:35.020692 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:19:35.319789 master-0 kubenswrapper[4102]: E0312 12:19:35.319749 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 12:19:35.540140 master-0 kubenswrapper[4102]: I0312 12:19:35.540061 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:35.541743 master-0 kubenswrapper[4102]: I0312 12:19:35.541714 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:35.541864 master-0 kubenswrapper[4102]: I0312 12:19:35.541848 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:35.541973 master-0 kubenswrapper[4102]: I0312 12:19:35.541958 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:35.542126 master-0 kubenswrapper[4102]: I0312 12:19:35.542110 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:35.552209 master-0 kubenswrapper[4102]: E0312 12:19:35.552155 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 12 12:19:35.678362 master-0 kubenswrapper[4102]: I0312 12:19:35.678179 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:35.822404 master-0 kubenswrapper[4102]: E0312 12:19:35.822296 4102 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 12 12:19:35.839760 master-0 kubenswrapper[4102]: I0312 12:19:35.839710 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:35.841421 master-0 kubenswrapper[4102]: I0312 12:19:35.841373 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:35.841560 master-0 kubenswrapper[4102]: I0312 12:19:35.841439 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:35.841560 master-0 kubenswrapper[4102]: I0312 12:19:35.841466 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:35.842252 master-0 kubenswrapper[4102]: I0312 12:19:35.842199 4102 scope.go:117] "RemoveContainer" containerID="db23106411152aa66583975d4e7af811e346c60413a32e2811259c380fc87177"
Mar 12 12:19:36.017393 master-0 kubenswrapper[4102]: I0312 12:19:36.017301 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:36.017464 master-0 kubenswrapper[4102]: I0312 12:19:36.017301 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:36.018171 master-0 kubenswrapper[4102]: I0312 12:19:36.018135 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:36.018229 master-0 kubenswrapper[4102]: I0312 12:19:36.018178 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:36.018258 master-0 kubenswrapper[4102]: I0312 12:19:36.018233 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:36.018798 master-0 kubenswrapper[4102]: I0312 12:19:36.018748 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:36.018859 master-0 kubenswrapper[4102]: I0312 12:19:36.018812 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:36.018859 master-0 kubenswrapper[4102]: I0312 12:19:36.018832 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:36.678831 master-0 kubenswrapper[4102]: I0312 12:19:36.678773 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:37.022411 master-0 kubenswrapper[4102]: I0312 12:19:37.022344 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 12 12:19:37.023373 master-0 kubenswrapper[4102]: I0312 12:19:37.023130 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 12 12:19:37.023897 master-0 kubenswrapper[4102]: I0312 12:19:37.023834 4102 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27" exitCode=1
Mar 12 12:19:37.024015 master-0 kubenswrapper[4102]: I0312 12:19:37.023900 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27"}
Mar 12 12:19:37.024015 master-0 kubenswrapper[4102]: I0312 12:19:37.023968 4102 scope.go:117] "RemoveContainer" containerID="db23106411152aa66583975d4e7af811e346c60413a32e2811259c380fc87177"
Mar 12 12:19:37.024015 master-0 kubenswrapper[4102]: I0312 12:19:37.023993 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:37.024251 master-0 kubenswrapper[4102]: I0312 12:19:37.024134 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:37.025445 master-0 kubenswrapper[4102]: I0312 12:19:37.025326 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:37.025445 master-0 kubenswrapper[4102]: I0312 12:19:37.025393 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:37.025445 master-0 kubenswrapper[4102]: I0312 12:19:37.025421 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:37.025998 master-0 kubenswrapper[4102]: I0312 12:19:37.025963 4102 scope.go:117] "RemoveContainer" containerID="e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27"
Mar 12 12:19:37.026330 master-0 kubenswrapper[4102]: E0312 12:19:37.026252 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 12 12:19:37.026577 master-0 kubenswrapper[4102]: I0312 12:19:37.026383 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:37.026577 master-0 kubenswrapper[4102]: I0312 12:19:37.026414 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:37.026577 master-0 kubenswrapper[4102]: I0312 12:19:37.026430 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:37.653789 master-0 kubenswrapper[4102]: E0312 12:19:37.653572 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c174905772d50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,LastTimestamp:2026-03-12 12:19:15.66888072 +0000 UTC m=+0.601657175,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:37.661471 master-0 kubenswrapper[4102]: E0312 12:19:37.661251 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:37.667710 master-0 kubenswrapper[4102]: E0312 12:19:37.667388 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:37.675044 master-0 kubenswrapper[4102]: E0312 12:19:37.674897 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099ecfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,LastTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:37.675600 master-0 kubenswrapper[4102]: I0312 12:19:37.675413 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:37.681359 master-0 kubenswrapper[4102]: E0312 12:19:37.681163 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490eb47a25 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.823893029 +0000 UTC m=+0.756669454,LastTimestamp:2026-03-12 12:19:15.823893029 +0000 UTC m=+0.756669454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:37.689264 master-0 kubenswrapper[4102]: E0312 12:19:37.688976 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c17490999fd22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.920375317 +0000 UTC m=+0.853151772,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:37.693872 master-0 kubenswrapper[4102]: E0312 12:19:37.693736 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099cb030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.920414079 +0000 UTC m=+0.853190534,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:37.700410 master-0 kubenswrapper[4102]: E0312 12:19:37.700266 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099ecfe4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\"
in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099ecfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,LastTimestamp:2026-03-12 12:19:15.920436021 +0000 UTC m=+0.853212476,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.705757 master-0 kubenswrapper[4102]: E0312 12:19:37.705552 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c17490999fd22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.940858835 +0000 UTC m=+0.873635280,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.713505 master-0 kubenswrapper[4102]: E0312 12:19:37.713291 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099cb030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.940888057 +0000 UTC m=+0.873664502,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.718835 master-0 kubenswrapper[4102]: E0312 12:19:37.718638 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099ecfe4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099ecfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,LastTimestamp:2026-03-12 12:19:15.940904718 +0000 UTC m=+0.873681163,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.723941 master-0 kubenswrapper[4102]: E0312 12:19:37.723759 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c17490999fd22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.942424403 +0000 UTC m=+0.875200848,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.729511 master-0 kubenswrapper[4102]: E0312 12:19:37.729338 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099cb030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.942462075 +0000 UTC m=+0.875238530,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.735366 master-0 kubenswrapper[4102]: E0312 12:19:37.735200 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099ecfe4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099ecfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,LastTimestamp:2026-03-12 12:19:15.942525169 +0000 UTC m=+0.875301614,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.740607 master-0 kubenswrapper[4102]: E0312 12:19:37.740379 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c17490999fd22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.942660756 +0000 UTC m=+0.875437171,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.748517 master-0 kubenswrapper[4102]: E0312 12:19:37.748340 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099cb030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.942764002 +0000 UTC m=+0.875540447,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.756977 master-0 kubenswrapper[4102]: E0312 12:19:37.756425 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099ecfe4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099ecfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,LastTimestamp:2026-03-12 12:19:15.942851327 +0000 UTC m=+0.875627772,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.762907 master-0 kubenswrapper[4102]: E0312 12:19:37.762728 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c17490999fd22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.94468637 +0000 UTC m=+0.877462825,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.768852 master-0 kubenswrapper[4102]: E0312 12:19:37.768644 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099cb030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.944715002 +0000 UTC m=+0.877491447,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.774960 master-0 kubenswrapper[4102]: E0312 12:19:37.774455 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099ecfe4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099ecfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,LastTimestamp:2026-03-12 12:19:15.944752454 +0000 UTC m=+0.877528909,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.781179 master-0 kubenswrapper[4102]: E0312 12:19:37.780986 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c17490999fd22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.945014768 +0000 UTC m=+0.877791193,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.786580 master-0 kubenswrapper[4102]: E0312 12:19:37.786377 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099cb030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.945033809 +0000 UTC m=+0.877810234,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.793553 master-0 kubenswrapper[4102]: E0312 12:19:37.793344 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099ecfe4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099ecfe4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.738587108 +0000 UTC m=+0.671363533,LastTimestamp:2026-03-12 12:19:15.945055141 +0000 UTC m=+0.877831576,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.798635 master-0 kubenswrapper[4102]: E0312 12:19:37.798444 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c17490999fd22\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c17490999fd22 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73827101 +0000 UTC m=+0.671047445,LastTimestamp:2026-03-12 12:19:15.946414967 +0000 UTC m=+0.879191422,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.804908 master-0 kubenswrapper[4102]: E0312 12:19:37.804731 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c1749099cb030\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c1749099cb030 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:15.73844792 +0000 UTC m=+0.671224345,LastTimestamp:2026-03-12 12:19:15.946453099 +0000 UTC m=+0.879229554,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.811237 master-0 kubenswrapper[4102]: E0312 12:19:37.811068 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c1749573b6762 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:17.040695138 +0000 UTC m=+1.973471593,LastTimestamp:2026-03-12 12:19:17.040695138 +0000 UTC m=+1.973471593,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.819238 master-0 kubenswrapper[4102]: E0312 12:19:37.819092 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c174957f93c85 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:17.053136005 +0000 UTC m=+1.985912450,LastTimestamp:2026-03-12 12:19:17.053136005 +0000 UTC m=+1.985912450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.825431 master-0 kubenswrapper[4102]: E0312 12:19:37.825256 4102 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c17495bce3ca8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:17.117426856 +0000 UTC m=+2.050203271,LastTimestamp:2026-03-12 12:19:17.117426856 +0000 UTC m=+2.050203271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.830957 master-0 kubenswrapper[4102]: E0312 12:19:37.830774 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c1749602485bd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:17.190190525 +0000 UTC m=+2.122966980,LastTimestamp:2026-03-12 12:19:17.190190525 +0000 UTC 
m=+2.122966980,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.836913 master-0 kubenswrapper[4102]: E0312 12:19:37.836716 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c1749630be136 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:17.23890719 +0000 UTC m=+2.171683645,LastTimestamp:2026-03-12 12:19:17.23890719 +0000 UTC m=+2.171683645,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.846775 master-0 kubenswrapper[4102]: E0312 12:19:37.843649 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a1ab7f1be openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" in 3.279s (3.279s including waiting). Image size: 465086330 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.320414142 +0000 UTC m=+5.253190567,LastTimestamp:2026-03-12 12:19:20.320414142 +0000 UTC m=+5.253190567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.853119 master-0 kubenswrapper[4102]: E0312 12:19:37.852623 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c174a1bac6e80 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" in 3.283s (3.283s including waiting). 
Image size: 529324693 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.336436864 +0000 UTC m=+5.269213279,LastTimestamp:2026-03-12 12:19:20.336436864 +0000 UTC m=+5.269213279,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.858325 master-0 kubenswrapper[4102]: E0312 12:19:37.858220 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a250c8ab5 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.493730485 +0000 UTC m=+5.426506900,LastTimestamp:2026-03-12 12:19:20.493730485 +0000 UTC m=+5.426506900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.863385 master-0 kubenswrapper[4102]: E0312 12:19:37.862937 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c174a25123afe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.494103294 +0000 UTC m=+5.426879709,LastTimestamp:2026-03-12 12:19:20.494103294 +0000 UTC m=+5.426879709,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.867983 master-0 kubenswrapper[4102]: E0312 12:19:37.867854 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a25aba312 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.504156946 +0000 UTC m=+5.436933361,LastTimestamp:2026-03-12 12:19:20.504156946 +0000 UTC m=+5.436933361,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.872563 master-0 kubenswrapper[4102]: E0312 12:19:37.872459 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" 
event="&Event{ObjectMeta:{etcd-master-0-master-0.189c174a25f4d35b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.508953435 +0000 UTC m=+5.441729850,LastTimestamp:2026-03-12 12:19:20.508953435 +0000 UTC m=+5.441729850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.877044 master-0 kubenswrapper[4102]: E0312 12:19:37.876942 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c174a26619879 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.516081785 +0000 UTC m=+5.448858200,LastTimestamp:2026-03-12 12:19:20.516081785 +0000 UTC m=+5.448858200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.881990 master-0 kubenswrapper[4102]: E0312 12:19:37.881862 4102 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4130b9fc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.965863932 +0000 UTC m=+5.898640347,LastTimestamp:2026-03-12 12:19:20.965863932 +0000 UTC m=+5.898640347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.888544 master-0 kubenswrapper[4102]: E0312 12:19:37.888353 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c174a47619237 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.069728311 +0000 UTC m=+6.002504726,LastTimestamp:2026-03-12 12:19:21.069728311 +0000 UTC m=+6.002504726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.894356 master-0 kubenswrapper[4102]: E0312 12:19:37.894241 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c174a48360f10 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.083653904 +0000 UTC m=+6.016430319,LastTimestamp:2026-03-12 12:19:21.083653904 +0000 UTC m=+6.016430319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.899950 master-0 kubenswrapper[4102]: E0312 12:19:37.899851 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4c905281 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.156678273 +0000 UTC m=+6.089454688,LastTimestamp:2026-03-12 
12:19:21.156678273 +0000 UTC m=+6.089454688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.905193 master-0 kubenswrapper[4102]: E0312 12:19:37.905069 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4d1dcea0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.165950624 +0000 UTC m=+6.098727039,LastTimestamp:2026-03-12 12:19:21.165950624 +0000 UTC m=+6.098727039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.911846 master-0 kubenswrapper[4102]: E0312 12:19:37.911717 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174a4130b9fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4130b9fc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.965863932 +0000 UTC m=+5.898640347,LastTimestamp:2026-03-12 12:19:21.969361268 +0000 UTC m=+6.902137683,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.919054 master-0 kubenswrapper[4102]: E0312 12:19:37.917853 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174a4c905281\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4c905281 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.156678273 +0000 UTC m=+6.089454688,LastTimestamp:2026-03-12 12:19:22.522184803 +0000 UTC m=+7.454961208,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.924449 master-0 kubenswrapper[4102]: E0312 12:19:37.923648 4102 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174a4d1dcea0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4d1dcea0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.165950624 +0000 UTC m=+6.098727039,LastTimestamp:2026-03-12 12:19:22.536499085 +0000 UTC m=+7.469275500,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.930037 master-0 kubenswrapper[4102]: E0312 12:19:37.929911 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174ab8c3babe openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod 
kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:22.971986622 +0000 UTC m=+7.904763037,LastTimestamp:2026-03-12 12:19:22.971986622 +0000 UTC m=+7.904763037,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.934780 master-0 kubenswrapper[4102]: E0312 12:19:37.934634 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174ab8c3babe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174ab8c3babe openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:22.971986622 +0000 UTC m=+7.904763037,LastTimestamp:2026-03-12 12:19:23.977949026 +0000 UTC m=+8.910725481,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.939701 master-0 kubenswrapper[4102]: E0312 12:19:37.939596 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174c37060e55 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 12.267s (12.267s including waiting). Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.385229909 +0000 UTC m=+14.318006334,LastTimestamp:2026-03-12 12:19:29.385229909 +0000 UTC m=+14.318006334,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.943620 master-0 kubenswrapper[4102]: E0312 12:19:37.943542 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c174c3d3590b0 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 12.25s (12.25s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.489006768 +0000 UTC m=+14.421783183,LastTimestamp:2026-03-12 12:19:29.489006768 +0000 UTC m=+14.421783183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.949448 master-0 kubenswrapper[4102]: E0312 12:19:37.949345 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174c4247dd43 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.574092099 +0000 UTC m=+14.506868504,LastTimestamp:2026-03-12 12:19:29.574092099 +0000 UTC m=+14.506868504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.957984 master-0 kubenswrapper[4102]: E0312 12:19:37.956656 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c174c42a7c6ef kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 12.39s (12.39s including waiting). Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.580377839 +0000 UTC m=+14.513154254,LastTimestamp:2026-03-12 12:19:29.580377839 +0000 UTC m=+14.513154254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.965514 master-0 kubenswrapper[4102]: E0312 12:19:37.965151 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174c42e58ed3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.584426707 +0000 UTC m=+14.517203122,LastTimestamp:2026-03-12 12:19:29.584426707 +0000 UTC m=+14.517203122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.972260 master-0 kubenswrapper[4102]: E0312 12:19:37.972101 4102 event.go:359] "Server rejected event (will not retry!)" err="events 
is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c174c4a52d6a1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.709029025 +0000 UTC m=+14.641805460,LastTimestamp:2026-03-12 12:19:29.709029025 +0000 UTC m=+14.641805460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.979275 master-0 kubenswrapper[4102]: E0312 12:19:37.979177 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c174c4af07a17 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.719360023 +0000 UTC m=+14.652136448,LastTimestamp:2026-03-12 12:19:29.719360023 +0000 UTC m=+14.652136448,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 
12:19:37.986185 master-0 kubenswrapper[4102]: E0312 12:19:37.986093 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c174c4b00f811 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.720440849 +0000 UTC m=+14.653217284,LastTimestamp:2026-03-12 12:19:29.720440849 +0000 UTC m=+14.653217284,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.991600 master-0 kubenswrapper[4102]: E0312 12:19:37.991287 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c174c4bea925c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.735750236 +0000 UTC m=+14.668526651,LastTimestamp:2026-03-12 
12:19:29.735750236 +0000 UTC m=+14.668526651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:37.999743 master-0 kubenswrapper[4102]: E0312 12:19:37.999573 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c174c4d3859d7 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.757624791 +0000 UTC m=+14.690401206,LastTimestamp:2026-03-12 12:19:29.757624791 +0000 UTC m=+14.690401206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:38.005566 master-0 kubenswrapper[4102]: E0312 12:19:38.005403 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174c5b8c0d12 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:29.997991186 +0000 UTC m=+14.930767641,LastTimestamp:2026-03-12 12:19:29.997991186 +0000 UTC m=+14.930767641,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:38.010324 master-0 kubenswrapper[4102]: E0312 12:19:38.010222 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174c68cdc91b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:30.220402971 +0000 UTC m=+15.153179386,LastTimestamp:2026-03-12 12:19:30.220402971 +0000 UTC m=+15.153179386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:38.014905 master-0 kubenswrapper[4102]: E0312 12:19:38.014794 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174c69387a92 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] 
[] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:30.227395218 +0000 UTC m=+15.160171633,LastTimestamp:2026-03-12 12:19:30.227395218 +0000 UTC m=+15.160171633,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:38.020581 master-0 kubenswrapper[4102]: E0312 12:19:38.019031 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174c6941dfdd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:30.228010973 +0000 UTC m=+15.160787388,LastTimestamp:2026-03-12 12:19:30.228010973 +0000 UTC m=+15.160787388,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:38.023704 master-0 kubenswrapper[4102]: E0312 12:19:38.023612 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c174cf0f77627 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\" in 2.784s (2.784s including waiting). Image size: 505242594 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:32.504835623 +0000 UTC m=+17.437612078,LastTimestamp:2026-03-12 12:19:32.504835623 +0000 UTC m=+17.437612078,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:19:38.027634 master-0 kubenswrapper[4102]: I0312 12:19:38.027608 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 12 12:19:38.029228 master-0 kubenswrapper[4102]: E0312 12:19:38.029118 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174cf19fe973 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" in 2.287s (2.287s including waiting). Image size: 514980169 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:32.515875187 +0000 UTC m=+17.448651602,LastTimestamp:2026-03-12 12:19:32.515875187 +0000 UTC m=+17.448651602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.033623 master-0 kubenswrapper[4102]: E0312 12:19:38.033505 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c174cfd523d0d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:32.712111373 +0000 UTC m=+17.644887788,LastTimestamp:2026-03-12 12:19:32.712111373 +0000 UTC m=+17.644887788,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.038469 master-0 kubenswrapper[4102]: E0312 12:19:38.038373 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174cfd558094 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:32.712325268 +0000 UTC m=+17.645101683,LastTimestamp:2026-03-12 12:19:32.712325268 +0000 UTC m=+17.645101683,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.043083 master-0 kubenswrapper[4102]: E0312 12:19:38.042991 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c174cfdd9eba2 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:32.721003426 +0000 UTC m=+17.653779841,LastTimestamp:2026-03-12 12:19:32.721003426 +0000 UTC m=+17.653779841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.047618 master-0 kubenswrapper[4102]: E0312 12:19:38.047527 4102 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c174cfde2f945 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:32.721596741 +0000 UTC m=+17.654373146,LastTimestamp:2026-03-12 12:19:32.721596741 +0000 UTC m=+17.654373146,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.053718 master-0 kubenswrapper[4102]: E0312 12:19:38.053580 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174a4130b9fc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4130b9fc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:20.965863932 +0000 UTC m=+5.898640347,LastTimestamp:2026-03-12 12:19:35.845953169 +0000 UTC m=+20.778729634,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.058904 master-0 kubenswrapper[4102]: E0312 12:19:38.058775 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174a4c905281\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4c905281 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.156678273 +0000 UTC m=+6.089454688,LastTimestamp:2026-03-12 12:19:36.071398117 +0000 UTC m=+21.004174542,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.064663 master-0 kubenswrapper[4102]: E0312 12:19:38.064563 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174a4d1dcea0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174a4d1dcea0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:21.165950624 +0000 UTC m=+6.098727039,LastTimestamp:2026-03-12 12:19:36.083703252 +0000 UTC m=+21.016479687,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.069031 master-0 kubenswrapper[4102]: E0312 12:19:38.068929 4102 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c174ab8c3babe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c174ab8c3babe openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:19:22.971986622 +0000 UTC m=+7.904763037,LastTimestamp:2026-03-12 12:19:37.026190927 +0000 UTC m=+21.958967372,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:19:38.679092 master-0 kubenswrapper[4102]: I0312 12:19:38.679023 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:39.431858 master-0 kubenswrapper[4102]: I0312 12:19:39.431802 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:39.432353 master-0 kubenswrapper[4102]: I0312 12:19:39.431949 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:39.432806 master-0 kubenswrapper[4102]: I0312 12:19:39.432788 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:39.432848 master-0 kubenswrapper[4102]: I0312 12:19:39.432813 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:39.432848 master-0 kubenswrapper[4102]: I0312 12:19:39.432824 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:39.677102 master-0 kubenswrapper[4102]: I0312 12:19:39.677055 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:39.924042 master-0 kubenswrapper[4102]: I0312 12:19:39.923828 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:40.033093 master-0 kubenswrapper[4102]: I0312 12:19:40.033023 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:40.034100 master-0 kubenswrapper[4102]: I0312 12:19:40.034063 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:40.034246 master-0 kubenswrapper[4102]: I0312 12:19:40.034111 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:40.034246 master-0 kubenswrapper[4102]: I0312 12:19:40.034132 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:40.679795 master-0 kubenswrapper[4102]: I0312 12:19:40.679736 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:41.679242 master-0 kubenswrapper[4102]: I0312 12:19:41.679185 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:42.329340 master-0 kubenswrapper[4102]: E0312 12:19:42.329241 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 12:19:42.463458 master-0 kubenswrapper[4102]: I0312 12:19:42.463376 4102 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:42.464014 master-0 kubenswrapper[4102]: I0312 12:19:42.463999 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:42.465439 master-0 kubenswrapper[4102]: I0312 12:19:42.465385 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:42.465542 master-0 kubenswrapper[4102]: I0312 12:19:42.465451 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:42.466289 master-0 kubenswrapper[4102]: I0312 12:19:42.465472 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:42.470221 master-0 kubenswrapper[4102]: I0312 12:19:42.470181 4102 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:42.552913 master-0 kubenswrapper[4102]: I0312 12:19:42.552794 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:42.554650 master-0 kubenswrapper[4102]: I0312 12:19:42.554598 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:42.554650 master-0 kubenswrapper[4102]: I0312 12:19:42.554650 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:42.554875 master-0 kubenswrapper[4102]: I0312 12:19:42.554668 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:42.554875 master-0 kubenswrapper[4102]: I0312 12:19:42.554767 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:42.561867 master-0 kubenswrapper[4102]: E0312 12:19:42.561775 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 12 12:19:42.679832 master-0 kubenswrapper[4102]: I0312 12:19:42.679673 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:43.043718 master-0 kubenswrapper[4102]: I0312 12:19:43.042685 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:43.044574 master-0 kubenswrapper[4102]: I0312 12:19:43.044303 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:43.044574 master-0 kubenswrapper[4102]: I0312 12:19:43.044369 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:43.044574 master-0 kubenswrapper[4102]: I0312 12:19:43.044391 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:43.050712 master-0 kubenswrapper[4102]: I0312 12:19:43.050643 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:19:43.678541 master-0 kubenswrapper[4102]: I0312 12:19:43.678442 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:44.045343 master-0 kubenswrapper[4102]: I0312 12:19:44.045249 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:44.046685 master-0 kubenswrapper[4102]: I0312 12:19:44.046638 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:44.046754 master-0 kubenswrapper[4102]: I0312 12:19:44.046692 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:44.046754 master-0 kubenswrapper[4102]: I0312 12:19:44.046713 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:44.679235 master-0 kubenswrapper[4102]: I0312 12:19:44.679130 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:45.681343 master-0 kubenswrapper[4102]: I0312 12:19:45.681164 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:45.822981 master-0 kubenswrapper[4102]: E0312 12:19:45.822904 4102 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 12 12:19:46.677520 master-0 kubenswrapper[4102]: I0312 12:19:46.677414 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:47.169451 master-0 kubenswrapper[4102]: I0312 12:19:47.169348 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 12 12:19:47.193293 master-0 kubenswrapper[4102]: I0312 12:19:47.193235 4102 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 12 12:19:47.677113 master-0 kubenswrapper[4102]: I0312 12:19:47.677055 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:48.677046 master-0 kubenswrapper[4102]: I0312 12:19:48.676948 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:49.335030 master-0 kubenswrapper[4102]: E0312 12:19:49.334990 4102 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 12 12:19:49.562706 master-0 kubenswrapper[4102]: I0312 12:19:49.562595 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:49.564687 master-0 kubenswrapper[4102]: I0312 12:19:49.564621 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:49.564687 master-0 kubenswrapper[4102]: I0312 12:19:49.564682 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:49.564975 master-0 kubenswrapper[4102]: I0312 12:19:49.564704 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:49.564975 master-0 kubenswrapper[4102]: I0312 12:19:49.564785 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:49.569779 master-0 kubenswrapper[4102]: E0312 12:19:49.569738 4102 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 12 12:19:49.675561 master-0 kubenswrapper[4102]: I0312 12:19:49.675446 4102 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 12 12:19:50.285643 master-0 kubenswrapper[4102]: I0312 12:19:50.285582 4102 csr.go:261] certificate signing request csr-p9m5g is approved, waiting to be issued
Mar 12 12:19:50.292920 master-0 kubenswrapper[4102]: I0312 12:19:50.292803 4102 csr.go:257] certificate signing request csr-p9m5g is issued
Mar 12 12:19:50.558768 master-0 kubenswrapper[4102]: I0312 12:19:50.558616 4102 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 12 12:19:50.680869 master-0 kubenswrapper[4102]: I0312 12:19:50.680804 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:50.695276 master-0 kubenswrapper[4102]: I0312 12:19:50.695218 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:50.752734 master-0 kubenswrapper[4102]: I0312 12:19:50.752664 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.020861 master-0 kubenswrapper[4102]: I0312 12:19:51.020770 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.020861 master-0 kubenswrapper[4102]: E0312 12:19:51.020822 4102 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 12 12:19:51.040853 master-0 kubenswrapper[4102]: I0312 12:19:51.040769 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.056102 master-0 kubenswrapper[4102]: I0312 12:19:51.056005 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.110972 master-0 kubenswrapper[4102]: I0312 12:19:51.110906 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.295411 master-0 kubenswrapper[4102]: I0312 12:19:51.295216 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-13 12:11:16 +0000 UTC, rotation deadline is 2026-03-13 09:32:20.21096341 +0000 UTC
Mar 12 12:19:51.295411 master-0 kubenswrapper[4102]: I0312 12:19:51.295303 4102 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h12m28.915669621s for next certificate rotation
Mar 12 12:19:51.382711 master-0 kubenswrapper[4102]: I0312 12:19:51.382627 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.382711 master-0 kubenswrapper[4102]: E0312 12:19:51.382667 4102 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 12 12:19:51.486625 master-0 kubenswrapper[4102]: I0312 12:19:51.486557 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.502927 master-0 kubenswrapper[4102]: I0312 12:19:51.502872 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.560470 master-0 kubenswrapper[4102]: I0312 12:19:51.560285 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.824034 master-0 kubenswrapper[4102]: I0312 12:19:51.823878 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:51.824034 master-0 kubenswrapper[4102]: E0312 12:19:51.823910 4102 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 12 12:19:51.839599 master-0 kubenswrapper[4102]: I0312 12:19:51.839455 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:51.841373 master-0 kubenswrapper[4102]: I0312 12:19:51.841305 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:51.841373 master-0 kubenswrapper[4102]: I0312 12:19:51.841368 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:51.841674 master-0 kubenswrapper[4102]: I0312 12:19:51.841390 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:51.841967 master-0 kubenswrapper[4102]: I0312 12:19:51.841920 4102 scope.go:117] "RemoveContainer" containerID="e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27"
Mar 12 12:19:51.842192 master-0 kubenswrapper[4102]: E0312 12:19:51.842147 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 12 12:19:51.954364 master-0 kubenswrapper[4102]: I0312 12:19:51.954256 4102 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 12 12:19:52.393297 master-0 kubenswrapper[4102]: I0312 12:19:52.393233 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:52.409721 master-0 kubenswrapper[4102]: I0312 12:19:52.409684 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:52.466428 master-0 kubenswrapper[4102]: I0312 12:19:52.466348 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:52.728538 master-0 kubenswrapper[4102]: I0312 12:19:52.728448 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:52.728538 master-0 kubenswrapper[4102]: E0312 12:19:52.728535 4102 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 12 12:19:53.101686 master-0 kubenswrapper[4102]: I0312 12:19:53.101420 4102 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 12 12:19:53.880211 master-0 kubenswrapper[4102]: I0312 12:19:53.880131 4102 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 12 12:19:54.046024 master-0 kubenswrapper[4102]: I0312 12:19:54.045924 4102 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 12 12:19:54.965149 master-0 kubenswrapper[4102]: I0312 12:19:54.965018 4102 apiserver.go:52] "Watching apiserver"
Mar 12 12:19:54.968331 master-0 kubenswrapper[4102]: I0312 12:19:54.968267 4102 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 12 12:19:54.968565 master-0 kubenswrapper[4102]: I0312 12:19:54.968443 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=[]
Mar 12 12:19:54.976297 master-0 kubenswrapper[4102]: I0312 12:19:54.976234 4102 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 12 12:19:55.823924 master-0 kubenswrapper[4102]: E0312 12:19:55.823847 4102 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 12 12:19:56.245809 master-0 kubenswrapper[4102]: I0312 12:19:56.245729 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:56.261722 master-0 kubenswrapper[4102]: I0312 12:19:56.261650 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:56.319609 master-0 kubenswrapper[4102]: I0312 12:19:56.319448 4102 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 12 12:19:56.340621 master-0 kubenswrapper[4102]: E0312 12:19:56.340532 4102 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 12 12:19:56.570574 master-0 kubenswrapper[4102]: I0312 12:19:56.570390 4102 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:19:56.572111 master-0 kubenswrapper[4102]: I0312 12:19:56.572050 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:19:56.572205 master-0 kubenswrapper[4102]: I0312 12:19:56.572122 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:19:56.572205 master-0 kubenswrapper[4102]: I0312 12:19:56.572142 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:19:56.572334 master-0 kubenswrapper[4102]: I0312 12:19:56.572224 4102 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:19:56.583436 master-0 kubenswrapper[4102]: I0312 12:19:56.583358 4102 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 12 12:19:56.772623 master-0 kubenswrapper[4102]: I0312 12:19:56.772521 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-p2nlp"]
Mar 12 12:19:56.772982 master-0 kubenswrapper[4102]: I0312 12:19:56.772881 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.774917 master-0 kubenswrapper[4102]: I0312 12:19:56.774860 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt"
Mar 12 12:19:56.775388 master-0 kubenswrapper[4102]: I0312 12:19:56.775337 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Mar 12 12:19:56.775815 master-0 kubenswrapper[4102]: I0312 12:19:56.775769 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 12 12:19:56.776351 master-0 kubenswrapper[4102]: I0312 12:19:56.776317 4102 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 12 12:19:56.836955 master-0 kubenswrapper[4102]: I0312 12:19:56.836783 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-resolv-conf\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.837185 master-0 kubenswrapper[4102]: I0312 12:19:56.836865 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-ca-bundle\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.837545 master-0 kubenswrapper[4102]: I0312 12:19:56.837411 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/33be2f5b-c837-4a07-8ad9-4400a36f53c1-kube-api-access-bfw6l\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.837636 master-0 kubenswrapper[4102]: I0312 12:19:56.837561 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-sno-bootstrap-files\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.837706 master-0 kubenswrapper[4102]: I0312 12:19:56.837687 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-var-run-resolv-conf\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938285 master-0 kubenswrapper[4102]: I0312 12:19:56.938172 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-sno-bootstrap-files\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938285 master-0 kubenswrapper[4102]: I0312 12:19:56.938240 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-var-run-resolv-conf\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938728 master-0 kubenswrapper[4102]: I0312 12:19:56.938326 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-var-run-resolv-conf\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938728 master-0 kubenswrapper[4102]: I0312 12:19:56.938379 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-sno-bootstrap-files\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938728 master-0 kubenswrapper[4102]: I0312 12:19:56.938448 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-resolv-conf\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938728 master-0 kubenswrapper[4102]: I0312 12:19:56.938568 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-ca-bundle\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938728 master-0 kubenswrapper[4102]: I0312 12:19:56.938582 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-resolv-conf\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938728 master-0 kubenswrapper[4102]: I0312 12:19:56.938626 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/33be2f5b-c837-4a07-8ad9-4400a36f53c1-kube-api-access-bfw6l\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.938728 master-0 kubenswrapper[4102]: I0312 12:19:56.938657 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-ca-bundle\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.964117 master-0 kubenswrapper[4102]: I0312 12:19:56.964028 4102 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 12 12:19:56.965841 master-0 kubenswrapper[4102]: I0312 12:19:56.965770 4102 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 12 12:19:56.974978 master-0 kubenswrapper[4102]: I0312 12:19:56.974644 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/33be2f5b-c837-4a07-8ad9-4400a36f53c1-kube-api-access-bfw6l\") pod \"assisted-installer-controller-p2nlp\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:19:56.983004 master-0 kubenswrapper[4102]: I0312 12:19:56.982939 4102 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 12 12:19:57.103053 master-0 kubenswrapper[4102]: I0312 12:19:57.102871 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-p2nlp" Mar 12 12:19:57.120781 master-0 kubenswrapper[4102]: W0312 12:19:57.120712 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33be2f5b_c837_4a07_8ad9_4400a36f53c1.slice/crio-63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506 WatchSource:0}: Error finding container 63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506: Status 404 returned error can't find the container with id 63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506 Mar 12 12:19:57.270692 master-0 kubenswrapper[4102]: I0312 12:19:57.270602 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/network-operator-7c649bf6d4-rbb5m"] Mar 12 12:19:57.271652 master-0 kubenswrapper[4102]: I0312 12:19:57.270975 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.272739 master-0 kubenswrapper[4102]: I0312 12:19:57.272696 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 12:19:57.272843 master-0 kubenswrapper[4102]: I0312 12:19:57.272797 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 12:19:57.274024 master-0 kubenswrapper[4102]: I0312 12:19:57.273865 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 12 12:19:57.340812 master-0 kubenswrapper[4102]: I0312 12:19:57.340754 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: 
\"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.340812 master-0 kubenswrapper[4102]: I0312 12:19:57.340784 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.340812 master-0 kubenswrapper[4102]: I0312 12:19:57.340802 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv9fk\" (UniqueName: \"kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.441147 master-0 kubenswrapper[4102]: I0312 12:19:57.440988 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9fk\" (UniqueName: \"kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.441147 master-0 kubenswrapper[4102]: I0312 12:19:57.441053 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.441541 master-0 kubenswrapper[4102]: I0312 12:19:57.441211 4102 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.441541 master-0 kubenswrapper[4102]: I0312 12:19:57.441365 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.446437 master-0 kubenswrapper[4102]: I0312 12:19:57.446360 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.473387 master-0 kubenswrapper[4102]: I0312 12:19:57.473290 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9fk\" (UniqueName: \"kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.583143 master-0 kubenswrapper[4102]: I0312 12:19:57.583037 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:19:57.601356 master-0 kubenswrapper[4102]: W0312 12:19:57.601276 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ab511b_72e9_4fb9_b5de_770f49514369.slice/crio-5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81 WatchSource:0}: Error finding container 5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81: Status 404 returned error can't find the container with id 5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81 Mar 12 12:19:58.649617 master-0 kubenswrapper[4102]: I0312 12:19:58.649544 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" event={"ID":"61ab511b-72e9-4fb9-b5de-770f49514369","Type":"ContainerStarted","Data":"5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81"} Mar 12 12:19:58.653367 master-0 kubenswrapper[4102]: I0312 12:19:58.653307 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-p2nlp" event={"ID":"33be2f5b-c837-4a07-8ad9-4400a36f53c1","Type":"ContainerStarted","Data":"63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506"} Mar 12 12:19:58.727035 master-0 kubenswrapper[4102]: I0312 12:19:58.726966 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"] Mar 12 12:19:58.727568 master-0 kubenswrapper[4102]: I0312 12:19:58.727532 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.730507 master-0 kubenswrapper[4102]: I0312 12:19:58.730295 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 12 12:19:58.730507 master-0 kubenswrapper[4102]: I0312 12:19:58.730328 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 12:19:58.730763 master-0 kubenswrapper[4102]: I0312 12:19:58.730698 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 12:19:58.851755 master-0 kubenswrapper[4102]: I0312 12:19:58.851717 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.851755 master-0 kubenswrapper[4102]: I0312 12:19:58.851761 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.851970 master-0 kubenswrapper[4102]: I0312 12:19:58.851781 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: 
\"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.851970 master-0 kubenswrapper[4102]: I0312 12:19:58.851819 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.851970 master-0 kubenswrapper[4102]: I0312 12:19:58.851844 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.953052 master-0 kubenswrapper[4102]: I0312 12:19:58.953012 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.953052 master-0 kubenswrapper[4102]: I0312 12:19:58.953053 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.953248 master-0 kubenswrapper[4102]: I0312 
12:19:58.953071 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.953248 master-0 kubenswrapper[4102]: I0312 12:19:58.953118 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.953366 master-0 kubenswrapper[4102]: E0312 12:19:58.953307 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 12 12:19:58.953433 master-0 kubenswrapper[4102]: I0312 12:19:58.953405 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.953576 master-0 kubenswrapper[4102]: E0312 12:19:58.953539 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:19:59.453408459 +0000 UTC m=+44.386184874 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found Mar 12 12:19:58.953624 master-0 kubenswrapper[4102]: I0312 12:19:58.953587 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.953662 master-0 kubenswrapper[4102]: I0312 12:19:58.953632 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.954122 master-0 kubenswrapper[4102]: I0312 12:19:58.954095 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:58.982470 master-0 kubenswrapper[4102]: I0312 12:19:58.982432 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " 
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:59.456110 master-0 kubenswrapper[4102]: I0312 12:19:59.455797 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:19:59.456110 master-0 kubenswrapper[4102]: E0312 12:19:59.455999 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 12 12:19:59.457613 master-0 kubenswrapper[4102]: E0312 12:19:59.456533 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:00.456504786 +0000 UTC m=+45.389281211 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found Mar 12 12:20:00.464706 master-0 kubenswrapper[4102]: I0312 12:20:00.464617 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:20:00.466576 master-0 kubenswrapper[4102]: E0312 12:20:00.464787 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 12 12:20:00.466576 master-0 kubenswrapper[4102]: E0312 12:20:00.464876 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:02.464856371 +0000 UTC m=+47.397632796 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found Mar 12 12:20:02.424812 master-0 kubenswrapper[4102]: I0312 12:20:02.424537 4102 csr.go:261] certificate signing request csr-n946r is approved, waiting to be issued Mar 12 12:20:02.477629 master-0 kubenswrapper[4102]: I0312 12:20:02.477519 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:20:02.477889 master-0 kubenswrapper[4102]: E0312 12:20:02.477706 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 12 12:20:02.477889 master-0 kubenswrapper[4102]: E0312 12:20:02.477787 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:06.47776438 +0000 UTC m=+51.410540805 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found Mar 12 12:20:02.553973 master-0 kubenswrapper[4102]: I0312 12:20:02.553918 4102 csr.go:257] certificate signing request csr-n946r is issued Mar 12 12:20:03.556032 master-0 kubenswrapper[4102]: I0312 12:20:03.555914 4102 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-13 12:11:16 +0000 UTC, rotation deadline is 2026-03-13 07:04:31.479191658 +0000 UTC Mar 12 12:20:03.556032 master-0 kubenswrapper[4102]: I0312 12:20:03.555963 4102 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h44m27.923231277s for next certificate rotation Mar 12 12:20:04.557149 master-0 kubenswrapper[4102]: I0312 12:20:04.557074 4102 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-13 12:11:16 +0000 UTC, rotation deadline is 2026-03-13 06:53:12.160643219 +0000 UTC Mar 12 12:20:04.557149 master-0 kubenswrapper[4102]: I0312 12:20:04.557130 4102 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h33m7.603517935s for next certificate rotation Mar 12 12:20:05.691888 master-0 kubenswrapper[4102]: I0312 12:20:05.691796 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" event={"ID":"61ab511b-72e9-4fb9-b5de-770f49514369","Type":"ContainerStarted","Data":"630c088f3826f86c1fe389a213d79d0dfdd3c10669dd76b8de7210253f979c04"} Mar 12 12:20:05.694220 master-0 kubenswrapper[4102]: I0312 12:20:05.694068 4102 generic.go:334] "Generic (PLEG): container finished" podID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerID="d6ff10b313edaeab618ba4bd948891faf24292c7fee20c8b60ece8104bb06b3a" exitCode=0 Mar 12 12:20:05.694220 master-0 
kubenswrapper[4102]: I0312 12:20:05.694129 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-p2nlp" event={"ID":"33be2f5b-c837-4a07-8ad9-4400a36f53c1","Type":"ContainerDied","Data":"d6ff10b313edaeab618ba4bd948891faf24292c7fee20c8b60ece8104bb06b3a"} Mar 12 12:20:05.730675 master-0 kubenswrapper[4102]: I0312 12:20:05.730574 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" podStartSLOduration=1.144921798 podStartE2EDuration="8.730551277s" podCreationTimestamp="2026-03-12 12:19:57 +0000 UTC" firstStartedPulling="2026-03-12 12:19:57.6050756 +0000 UTC m=+42.537852025" lastFinishedPulling="2026-03-12 12:20:05.190705089 +0000 UTC m=+50.123481504" observedRunningTime="2026-03-12 12:20:05.710989018 +0000 UTC m=+50.643765433" watchObservedRunningTime="2026-03-12 12:20:05.730551277 +0000 UTC m=+50.663327702" Mar 12 12:20:06.516255 master-0 kubenswrapper[4102]: I0312 12:20:06.516180 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:20:06.516697 master-0 kubenswrapper[4102]: E0312 12:20:06.516558 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 12 12:20:06.517018 master-0 kubenswrapper[4102]: E0312 12:20:06.516986 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:20:14.516939399 +0000 UTC m=+59.449715844 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found Mar 12 12:20:06.722282 master-0 kubenswrapper[4102]: I0312 12:20:06.722212 4102 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-p2nlp" Mar 12 12:20:06.819018 master-0 kubenswrapper[4102]: I0312 12:20:06.818812 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-sno-bootstrap-files\") pod \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " Mar 12 12:20:06.819018 master-0 kubenswrapper[4102]: I0312 12:20:06.818918 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-ca-bundle\") pod \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " Mar 12 12:20:06.819018 master-0 kubenswrapper[4102]: I0312 12:20:06.818969 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-resolv-conf\") pod \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " Mar 12 12:20:06.819018 master-0 kubenswrapper[4102]: I0312 12:20:06.818987 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-sno-bootstrap-files" (OuterVolumeSpecName: 
"sno-bootstrap-files") pod "33be2f5b-c837-4a07-8ad9-4400a36f53c1" (UID: "33be2f5b-c837-4a07-8ad9-4400a36f53c1"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819033 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/33be2f5b-c837-4a07-8ad9-4400a36f53c1-kube-api-access-bfw6l\") pod \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819038 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "33be2f5b-c837-4a07-8ad9-4400a36f53c1" (UID: "33be2f5b-c837-4a07-8ad9-4400a36f53c1"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819074 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "33be2f5b-c837-4a07-8ad9-4400a36f53c1" (UID: "33be2f5b-c837-4a07-8ad9-4400a36f53c1"). InnerVolumeSpecName "host-resolv-conf". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819076 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-var-run-resolv-conf\") pod \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\" (UID: \"33be2f5b-c837-4a07-8ad9-4400a36f53c1\") " Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819149 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "33be2f5b-c837-4a07-8ad9-4400a36f53c1" (UID: "33be2f5b-c837-4a07-8ad9-4400a36f53c1"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819292 4102 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819320 4102 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819345 4102 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:06.819543 master-0 kubenswrapper[4102]: I0312 12:20:06.819369 4102 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: 
\"kubernetes.io/host-path/33be2f5b-c837-4a07-8ad9-4400a36f53c1-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\""
Mar 12 12:20:06.824815 master-0 kubenswrapper[4102]: I0312 12:20:06.824753 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33be2f5b-c837-4a07-8ad9-4400a36f53c1-kube-api-access-bfw6l" (OuterVolumeSpecName: "kube-api-access-bfw6l") pod "33be2f5b-c837-4a07-8ad9-4400a36f53c1" (UID: "33be2f5b-c837-4a07-8ad9-4400a36f53c1"). InnerVolumeSpecName "kube-api-access-bfw6l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:20:06.920373 master-0 kubenswrapper[4102]: I0312 12:20:06.920286 4102 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfw6l\" (UniqueName: \"kubernetes.io/projected/33be2f5b-c837-4a07-8ad9-4400a36f53c1-kube-api-access-bfw6l\") on node \"master-0\" DevicePath \"\""
Mar 12 12:20:07.505222 master-0 kubenswrapper[4102]: I0312 12:20:07.505172 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 12 12:20:07.505795 master-0 kubenswrapper[4102]: I0312 12:20:07.505776 4102 scope.go:117] "RemoveContainer" containerID="e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27"
Mar 12 12:20:07.635752 master-0 kubenswrapper[4102]: I0312 12:20:07.635664 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-tv2nf"]
Mar 12 12:20:07.635960 master-0 kubenswrapper[4102]: E0312 12:20:07.635768 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller"
Mar 12 12:20:07.635960 master-0 kubenswrapper[4102]: I0312 12:20:07.635782 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller"
Mar 12 12:20:07.635960 master-0 kubenswrapper[4102]: I0312 12:20:07.635800 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller"
Mar 12 12:20:07.636087 master-0 kubenswrapper[4102]: I0312 12:20:07.635970 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-tv2nf"
Mar 12 12:20:07.699110 master-0 kubenswrapper[4102]: I0312 12:20:07.698833 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-p2nlp" event={"ID":"33be2f5b-c837-4a07-8ad9-4400a36f53c1","Type":"ContainerDied","Data":"63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506"}
Mar 12 12:20:07.699110 master-0 kubenswrapper[4102]: I0312 12:20:07.699110 4102 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506"
Mar 12 12:20:07.699246 master-0 kubenswrapper[4102]: I0312 12:20:07.699178 4102 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-p2nlp"
Mar 12 12:20:07.726866 master-0 kubenswrapper[4102]: I0312 12:20:07.726823 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plr2f\" (UniqueName: \"kubernetes.io/projected/cce66fc4-350d-4a86-acb2-d8d672cf2491-kube-api-access-plr2f\") pod \"mtu-prober-tv2nf\" (UID: \"cce66fc4-350d-4a86-acb2-d8d672cf2491\") " pod="openshift-network-operator/mtu-prober-tv2nf"
Mar 12 12:20:07.827885 master-0 kubenswrapper[4102]: I0312 12:20:07.827770 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plr2f\" (UniqueName: \"kubernetes.io/projected/cce66fc4-350d-4a86-acb2-d8d672cf2491-kube-api-access-plr2f\") pod \"mtu-prober-tv2nf\" (UID: \"cce66fc4-350d-4a86-acb2-d8d672cf2491\") " pod="openshift-network-operator/mtu-prober-tv2nf"
Mar 12 12:20:07.844942 master-0 kubenswrapper[4102]: I0312 12:20:07.844881 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plr2f\" (UniqueName: \"kubernetes.io/projected/cce66fc4-350d-4a86-acb2-d8d672cf2491-kube-api-access-plr2f\") pod \"mtu-prober-tv2nf\" (UID: \"cce66fc4-350d-4a86-acb2-d8d672cf2491\") " pod="openshift-network-operator/mtu-prober-tv2nf"
Mar 12 12:20:07.949845 master-0 kubenswrapper[4102]: I0312 12:20:07.949792 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-tv2nf"
Mar 12 12:20:07.965346 master-0 kubenswrapper[4102]: W0312 12:20:07.965278 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcce66fc4_350d_4a86_acb2_d8d672cf2491.slice/crio-68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2 WatchSource:0}: Error finding container 68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2: Status 404 returned error can't find the container with id 68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2
Mar 12 12:20:08.704935 master-0 kubenswrapper[4102]: I0312 12:20:08.704891 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 12 12:20:08.706183 master-0 kubenswrapper[4102]: I0312 12:20:08.706135 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a732af73f8df5350fffca3678cce89972e48b40d0d2aaf3a11513ec460452d56"}
Mar 12 12:20:08.708966 master-0 kubenswrapper[4102]: I0312 12:20:08.708915 4102 generic.go:334] "Generic (PLEG): container finished" podID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerID="823aef6fae9a1e0ac9ce3e87b09c4b094495f36691d261ce34a1a0d40c54755e" exitCode=0
Mar 12 12:20:08.709162 master-0 kubenswrapper[4102]: I0312 12:20:08.708980 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-tv2nf" event={"ID":"cce66fc4-350d-4a86-acb2-d8d672cf2491","Type":"ContainerDied","Data":"823aef6fae9a1e0ac9ce3e87b09c4b094495f36691d261ce34a1a0d40c54755e"}
Mar 12 12:20:08.709317 master-0 kubenswrapper[4102]: I0312 12:20:08.709289 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-tv2nf" event={"ID":"cce66fc4-350d-4a86-acb2-d8d672cf2491","Type":"ContainerStarted","Data":"68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2"}
Mar 12 12:20:08.722473 master-0 kubenswrapper[4102]: I0312 12:20:08.722414 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=1.722392345 podStartE2EDuration="1.722392345s" podCreationTimestamp="2026-03-12 12:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:20:08.722251162 +0000 UTC m=+53.655027617" watchObservedRunningTime="2026-03-12 12:20:08.722392345 +0000 UTC m=+53.655168800"
Mar 12 12:20:09.730252 master-0 kubenswrapper[4102]: I0312 12:20:09.730207 4102 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-tv2nf"
Mar 12 12:20:09.841938 master-0 kubenswrapper[4102]: I0312 12:20:09.841872 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plr2f\" (UniqueName: \"kubernetes.io/projected/cce66fc4-350d-4a86-acb2-d8d672cf2491-kube-api-access-plr2f\") pod \"cce66fc4-350d-4a86-acb2-d8d672cf2491\" (UID: \"cce66fc4-350d-4a86-acb2-d8d672cf2491\") "
Mar 12 12:20:09.846392 master-0 kubenswrapper[4102]: I0312 12:20:09.846317 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cce66fc4-350d-4a86-acb2-d8d672cf2491-kube-api-access-plr2f" (OuterVolumeSpecName: "kube-api-access-plr2f") pod "cce66fc4-350d-4a86-acb2-d8d672cf2491" (UID: "cce66fc4-350d-4a86-acb2-d8d672cf2491"). InnerVolumeSpecName "kube-api-access-plr2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:20:09.942454 master-0 kubenswrapper[4102]: I0312 12:20:09.942393 4102 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plr2f\" (UniqueName: \"kubernetes.io/projected/cce66fc4-350d-4a86-acb2-d8d672cf2491-kube-api-access-plr2f\") on node \"master-0\" DevicePath \"\""
Mar 12 12:20:10.716806 master-0 kubenswrapper[4102]: I0312 12:20:10.716709 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-tv2nf" event={"ID":"cce66fc4-350d-4a86-acb2-d8d672cf2491","Type":"ContainerDied","Data":"68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2"}
Mar 12 12:20:10.716806 master-0 kubenswrapper[4102]: I0312 12:20:10.716770 4102 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2"
Mar 12 12:20:10.717623 master-0 kubenswrapper[4102]: I0312 12:20:10.716810 4102 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-tv2nf"
Mar 12 12:20:12.591509 master-0 kubenswrapper[4102]: I0312 12:20:12.591452 4102 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-tv2nf"]
Mar 12 12:20:12.601533 master-0 kubenswrapper[4102]: I0312 12:20:12.601265 4102 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-tv2nf"]
Mar 12 12:20:13.847047 master-0 kubenswrapper[4102]: I0312 12:20:13.846942 4102 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" path="/var/lib/kubelet/pods/cce66fc4-350d-4a86-acb2-d8d672cf2491/volumes"
Mar 12 12:20:14.576028 master-0 kubenswrapper[4102]: I0312 12:20:14.575960 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:20:14.576313 master-0 kubenswrapper[4102]: E0312 12:20:14.576143 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:20:14.576313 master-0 kubenswrapper[4102]: E0312 12:20:14.576255 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:30.576221709 +0000 UTC m=+75.508998164 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:20:17.473512 master-0 kubenswrapper[4102]: I0312 12:20:17.473409 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-hb48g"]
Mar 12 12:20:17.474870 master-0 kubenswrapper[4102]: E0312 12:20:17.474802 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober"
Mar 12 12:20:17.474870 master-0 kubenswrapper[4102]: I0312 12:20:17.474854 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober"
Mar 12 12:20:17.475008 master-0 kubenswrapper[4102]: I0312 12:20:17.474890 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober"
Mar 12 12:20:17.475221 master-0 kubenswrapper[4102]: I0312 12:20:17.475181 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.477866 master-0 kubenswrapper[4102]: I0312 12:20:17.477710 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 12 12:20:17.478071 master-0 kubenswrapper[4102]: I0312 12:20:17.478033 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 12 12:20:17.478307 master-0 kubenswrapper[4102]: I0312 12:20:17.478202 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 12 12:20:17.478965 master-0 kubenswrapper[4102]: I0312 12:20:17.478929 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 12 12:20:17.595966 master-0 kubenswrapper[4102]: I0312 12:20:17.595903 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.596444 master-0 kubenswrapper[4102]: I0312 12:20:17.596397 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.596777 master-0 kubenswrapper[4102]: I0312 12:20:17.596746 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.597019 master-0 kubenswrapper[4102]: I0312 12:20:17.596965 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.597269 master-0 kubenswrapper[4102]: I0312 12:20:17.597231 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.597589 master-0 kubenswrapper[4102]: I0312 12:20:17.597555 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.597850 master-0 kubenswrapper[4102]: I0312 12:20:17.597816 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.598074 master-0 kubenswrapper[4102]: I0312 12:20:17.598048 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.598246 master-0 kubenswrapper[4102]: I0312 12:20:17.598223 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.598446 master-0 kubenswrapper[4102]: I0312 12:20:17.598421 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.598651 master-0 kubenswrapper[4102]: I0312 12:20:17.598628 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrqx7\" (UniqueName: \"kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.599044 master-0 kubenswrapper[4102]: I0312 12:20:17.598820 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.599217 master-0 kubenswrapper[4102]: I0312 12:20:17.599193 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.599420 master-0 kubenswrapper[4102]: I0312 12:20:17.599385 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.600434 master-0 kubenswrapper[4102]: I0312 12:20:17.600404 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.600944 master-0 kubenswrapper[4102]: I0312 12:20:17.600918 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.601120 master-0 kubenswrapper[4102]: I0312 12:20:17.601096 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702023 master-0 kubenswrapper[4102]: I0312 12:20:17.701957 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702322 master-0 kubenswrapper[4102]: I0312 12:20:17.702294 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702461 master-0 kubenswrapper[4102]: I0312 12:20:17.702062 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702676 master-0 kubenswrapper[4102]: I0312 12:20:17.702609 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702676 master-0 kubenswrapper[4102]: I0312 12:20:17.702535 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702676 master-0 kubenswrapper[4102]: I0312 12:20:17.702567 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702890 master-0 kubenswrapper[4102]: I0312 12:20:17.702695 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702890 master-0 kubenswrapper[4102]: I0312 12:20:17.702727 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702890 master-0 kubenswrapper[4102]: I0312 12:20:17.702758 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702890 master-0 kubenswrapper[4102]: I0312 12:20:17.702788 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.702890 master-0 kubenswrapper[4102]: I0312 12:20:17.702881 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703169 master-0 kubenswrapper[4102]: I0312 12:20:17.702924 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703169 master-0 kubenswrapper[4102]: I0312 12:20:17.703039 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703169 master-0 kubenswrapper[4102]: I0312 12:20:17.703074 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703169 master-0 kubenswrapper[4102]: I0312 12:20:17.703106 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703169 master-0 kubenswrapper[4102]: I0312 12:20:17.703134 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703169 master-0 kubenswrapper[4102]: I0312 12:20:17.703165 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703623 master-0 kubenswrapper[4102]: I0312 12:20:17.703198 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703623 master-0 kubenswrapper[4102]: I0312 12:20:17.703226 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703623 master-0 kubenswrapper[4102]: I0312 12:20:17.703259 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703623 master-0 kubenswrapper[4102]: I0312 12:20:17.703287 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703623 master-0 kubenswrapper[4102]: I0312 12:20:17.703315 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.703623 master-0 kubenswrapper[4102]: I0312 12:20:17.703344 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqx7\" (UniqueName: \"kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704094 master-0 kubenswrapper[4102]: I0312 12:20:17.703824 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704094 master-0 kubenswrapper[4102]: I0312 12:20:17.704057 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704333 master-0 kubenswrapper[4102]: I0312 12:20:17.704118 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704333 master-0 kubenswrapper[4102]: I0312 12:20:17.704189 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704333 master-0 kubenswrapper[4102]: I0312 12:20:17.704252 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704333 master-0 kubenswrapper[4102]: I0312 12:20:17.704295 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704584 master-0 kubenswrapper[4102]: I0312 12:20:17.704319 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704584 master-0 kubenswrapper[4102]: I0312 12:20:17.704370 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.704584 master-0 kubenswrapper[4102]: I0312 12:20:17.704429 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.705240 master-0 kubenswrapper[4102]: I0312 12:20:17.705174 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.960747 master-0 kubenswrapper[4102]: I0312 12:20:17.960698 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqx7\" (UniqueName: \"kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:20:17.990961 master-0 kubenswrapper[4102]: I0312 12:20:17.990916 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-r86hc"]
Mar 12 12:20:17.991954 master-0 kubenswrapper[4102]: I0312 12:20:17.991925 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:17.994020 master-0 kubenswrapper[4102]: I0312 12:20:17.993979 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 12 12:20:17.994505 master-0 kubenswrapper[4102]: I0312 12:20:17.994448 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 12 12:20:18.102286 master-0 kubenswrapper[4102]: I0312 12:20:18.102195 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-hb48g"
Mar 12 12:20:18.105440 master-0 kubenswrapper[4102]: I0312 12:20:18.105380 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.105440 master-0 kubenswrapper[4102]: I0312 12:20:18.105425 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.105705 master-0 kubenswrapper[4102]: I0312 12:20:18.105452 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.105705 master-0 kubenswrapper[4102]: I0312 12:20:18.105538 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdwfl\" (UniqueName: \"kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.105705 master-0 kubenswrapper[4102]: I0312 12:20:18.105578 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.105705 master-0 kubenswrapper[4102]: I0312 12:20:18.105602 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.105705 master-0 kubenswrapper[4102]: I0312 12:20:18.105621 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.105705 master-0 kubenswrapper[4102]: I0312 12:20:18.105644 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.118713 master-0 kubenswrapper[4102]: W0312 12:20:18.118649 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod666857a1_0ddf_4b48_91f4_44cce154d1b1.slice/crio-aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79 WatchSource:0}: Error finding container aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79: Status 404 returned error can't find the container with id aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79
Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206308 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206374 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206399 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwfl\" (UniqueName: \"kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206422 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206446 4102 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206471 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206541 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.206631 master-0 kubenswrapper[4102]: I0312 12:20:18.206580 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.207517 master-0 kubenswrapper[4102]: I0312 12:20:18.206724 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 
12:20:18.207517 master-0 kubenswrapper[4102]: I0312 12:20:18.207009 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.207517 master-0 kubenswrapper[4102]: I0312 12:20:18.207313 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.207816 master-0 kubenswrapper[4102]: I0312 12:20:18.207717 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.208602 master-0 kubenswrapper[4102]: I0312 12:20:18.208547 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.208731 master-0 kubenswrapper[4102]: I0312 12:20:18.208607 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy\") pod \"multus-additional-cni-plugins-r86hc\" (UID: 
\"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.208887 master-0 kubenswrapper[4102]: I0312 12:20:18.208837 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.239471 master-0 kubenswrapper[4102]: I0312 12:20:18.239331 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwfl\" (UniqueName: \"kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.310635 master-0 kubenswrapper[4102]: I0312 12:20:18.310561 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:20:18.323257 master-0 kubenswrapper[4102]: W0312 12:20:18.323181 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10498208_0692_4533_b672_a7a2cfcdf1be.slice/crio-5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c WatchSource:0}: Error finding container 5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c: Status 404 returned error can't find the container with id 5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c Mar 12 12:20:18.458411 master-0 kubenswrapper[4102]: I0312 12:20:18.458353 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4m9jh"] Mar 12 12:20:18.459228 master-0 kubenswrapper[4102]: I0312 12:20:18.459189 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:18.459635 master-0 kubenswrapper[4102]: E0312 12:20:18.459550 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:18.608948 master-0 kubenswrapper[4102]: I0312 12:20:18.608737 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:18.608948 master-0 kubenswrapper[4102]: I0312 12:20:18.608798 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq5c7\" (UniqueName: \"kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:18.709860 master-0 kubenswrapper[4102]: I0312 12:20:18.709744 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5c7\" (UniqueName: \"kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:18.710172 master-0 kubenswrapper[4102]: I0312 12:20:18.710101 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:18.710422 master-0 kubenswrapper[4102]: E0312 12:20:18.710371 4102 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not 
registered Mar 12 12:20:18.710589 master-0 kubenswrapper[4102]: E0312 12:20:18.710551 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:20:19.210453878 +0000 UTC m=+64.143230293 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:18.735984 master-0 kubenswrapper[4102]: I0312 12:20:18.735873 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hb48g" event={"ID":"666857a1-0ddf-4b48-91f4-44cce154d1b1","Type":"ContainerStarted","Data":"aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79"} Mar 12 12:20:18.737328 master-0 kubenswrapper[4102]: I0312 12:20:18.737266 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerStarted","Data":"5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c"} Mar 12 12:20:18.739030 master-0 kubenswrapper[4102]: I0312 12:20:18.738992 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5c7\" (UniqueName: \"kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:19.214846 master-0 kubenswrapper[4102]: I0312 12:20:19.214794 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:19.215063 master-0 kubenswrapper[4102]: E0312 12:20:19.214948 4102 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:19.215063 master-0 kubenswrapper[4102]: E0312 12:20:19.215004 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:20:20.214986189 +0000 UTC m=+65.147762604 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:19.839250 master-0 kubenswrapper[4102]: I0312 12:20:19.839163 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:19.840037 master-0 kubenswrapper[4102]: E0312 12:20:19.839294 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:20.223528 master-0 kubenswrapper[4102]: I0312 12:20:20.223447 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:20.223774 master-0 kubenswrapper[4102]: E0312 12:20:20.223644 4102 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:20.223774 master-0 kubenswrapper[4102]: E0312 12:20:20.223739 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:20:22.223718202 +0000 UTC m=+67.156494627 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:21.839081 master-0 kubenswrapper[4102]: I0312 12:20:21.839032 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:21.840494 master-0 kubenswrapper[4102]: E0312 12:20:21.839364 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:22.241535 master-0 kubenswrapper[4102]: I0312 12:20:22.241463 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:22.241726 master-0 kubenswrapper[4102]: E0312 12:20:22.241654 4102 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:22.241760 master-0 kubenswrapper[4102]: E0312 12:20:22.241719 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:20:26.241702164 +0000 UTC m=+71.174478579 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:22.746705 master-0 kubenswrapper[4102]: I0312 12:20:22.746419 4102 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="05ebbc8f6ffb604ea4cb658572a6553a226fea47b2132dba48a9b8a612eeb8a1" exitCode=0 Mar 12 12:20:22.746705 master-0 kubenswrapper[4102]: I0312 12:20:22.746682 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerDied","Data":"05ebbc8f6ffb604ea4cb658572a6553a226fea47b2132dba48a9b8a612eeb8a1"} Mar 12 12:20:23.839104 master-0 kubenswrapper[4102]: I0312 12:20:23.838976 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:23.840432 master-0 kubenswrapper[4102]: E0312 12:20:23.839172 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:25.841555 master-0 kubenswrapper[4102]: I0312 12:20:25.841494 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:25.842060 master-0 kubenswrapper[4102]: E0312 12:20:25.841585 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:26.788357 master-0 kubenswrapper[4102]: I0312 12:20:26.785885 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:26.788357 master-0 kubenswrapper[4102]: E0312 12:20:26.786126 4102 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:26.788357 master-0 kubenswrapper[4102]: E0312 12:20:26.786206 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:20:34.786175082 +0000 UTC m=+79.718951497 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:27.871212 master-0 kubenswrapper[4102]: I0312 12:20:27.871122 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:27.872174 master-0 kubenswrapper[4102]: E0312 12:20:27.871260 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:29.839568 master-0 kubenswrapper[4102]: I0312 12:20:29.839502 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:29.840163 master-0 kubenswrapper[4102]: E0312 12:20:29.839723 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:29.875024 master-0 kubenswrapper[4102]: I0312 12:20:29.874926 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"] Mar 12 12:20:29.875427 master-0 kubenswrapper[4102]: I0312 12:20:29.875392 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.879509 master-0 kubenswrapper[4102]: I0312 12:20:29.879168 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 12:20:29.879509 master-0 kubenswrapper[4102]: I0312 12:20:29.879198 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 12:20:29.879509 master-0 kubenswrapper[4102]: I0312 12:20:29.879248 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 12:20:29.879509 master-0 kubenswrapper[4102]: I0312 12:20:29.879345 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 12:20:29.881621 master-0 kubenswrapper[4102]: I0312 12:20:29.880930 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 12 12:20:29.886003 master-0 kubenswrapper[4102]: I0312 12:20:29.885676 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2f6r\" (UniqueName: \"kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.886003 master-0 kubenswrapper[4102]: I0312 12:20:29.885739 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 
12:20:29.886003 master-0 kubenswrapper[4102]: I0312 12:20:29.885804 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.886003 master-0 kubenswrapper[4102]: I0312 12:20:29.885884 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.986356 master-0 kubenswrapper[4102]: I0312 12:20:29.986283 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.986356 master-0 kubenswrapper[4102]: I0312 12:20:29.986363 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.988810 master-0 kubenswrapper[4102]: I0312 12:20:29.986597 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.988810 master-0 kubenswrapper[4102]: I0312 12:20:29.986659 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2f6r\" (UniqueName: \"kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.988810 master-0 kubenswrapper[4102]: I0312 12:20:29.987403 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.988810 master-0 kubenswrapper[4102]: I0312 12:20:29.987752 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:20:29.999719 master-0 kubenswrapper[4102]: I0312 12:20:29.991683 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"
Mar 12 12:20:30.004105 master-0 kubenswrapper[4102]: I0312 12:20:30.004062 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2f6r\" (UniqueName: \"kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"
Mar 12 12:20:30.090397 master-0 kubenswrapper[4102]: I0312 12:20:30.090245 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsbxb"]
Mar 12 12:20:30.090896 master-0 kubenswrapper[4102]: I0312 12:20:30.090868 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.093192 master-0 kubenswrapper[4102]: I0312 12:20:30.093159 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 12 12:20:30.094286 master-0 kubenswrapper[4102]: I0312 12:20:30.094079 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 12 12:20:30.191529 master-0 kubenswrapper[4102]: I0312 12:20:30.191437 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"
Mar 12 12:20:30.191993 master-0 kubenswrapper[4102]: I0312 12:20:30.191971 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-config\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192083 master-0 kubenswrapper[4102]: I0312 12:20:30.191996 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-script-lib\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192083 master-0 kubenswrapper[4102]: I0312 12:20:30.192016 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-netd\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192083 master-0 kubenswrapper[4102]: I0312 12:20:30.192047 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-var-lib-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192083 master-0 kubenswrapper[4102]: I0312 12:20:30.192076 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192098 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-bin\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192155 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-systemd-units\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192202 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-netns\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192224 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192257 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovn-node-metrics-cert\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192274 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-kubelet\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192292 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192311 master-0 kubenswrapper[4102]: I0312 12:20:30.192307 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfmp5\" (UniqueName: \"kubernetes.io/projected/b7b07b83-cade-4b52-b568-5c7e9d7c496b-kube-api-access-mfmp5\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192820 master-0 kubenswrapper[4102]: I0312 12:20:30.192321 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-etc-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192820 master-0 kubenswrapper[4102]: I0312 12:20:30.192335 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-ovn\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192820 master-0 kubenswrapper[4102]: I0312 12:20:30.192361 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-env-overrides\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192820 master-0 kubenswrapper[4102]: I0312 12:20:30.192379 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-node-log\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192820 master-0 kubenswrapper[4102]: I0312 12:20:30.192395 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-systemd\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192820 master-0 kubenswrapper[4102]: I0312 12:20:30.192410 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-log-socket\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.192820 master-0 kubenswrapper[4102]: I0312 12:20:30.192437 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-slash\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.206787 master-0 kubenswrapper[4102]: W0312 12:20:30.206679 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f3a291a_d9af_4e0f_a307_8928e4dc523d.slice/crio-ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e WatchSource:0}: Error finding container ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e: Status 404 returned error can't find the container with id ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e
Mar 12 12:20:30.293918 master-0 kubenswrapper[4102]: I0312 12:20:30.293840 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-etc-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.293918 master-0 kubenswrapper[4102]: I0312 12:20:30.293913 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-ovn\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.293918 master-0 kubenswrapper[4102]: I0312 12:20:30.293935 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfmp5\" (UniqueName: \"kubernetes.io/projected/b7b07b83-cade-4b52-b568-5c7e9d7c496b-kube-api-access-mfmp5\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.294257 master-0 kubenswrapper[4102]: I0312 12:20:30.293971 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-env-overrides\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.294257 master-0 kubenswrapper[4102]: I0312 12:20:30.294060 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-etc-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.294257 master-0 kubenswrapper[4102]: I0312 12:20:30.294159 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-node-log\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.294257 master-0 kubenswrapper[4102]: I0312 12:20:30.294190 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-systemd\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.294257 master-0 kubenswrapper[4102]: I0312 12:20:30.294184 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-ovn\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.294257 master-0 kubenswrapper[4102]: I0312 12:20:30.294214 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-log-socket\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.294257 master-0 kubenswrapper[4102]: I0312 12:20:30.294259 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-log-socket\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294272 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-node-log\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294311 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-slash\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294330 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-systemd\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294346 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-config\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294371 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-script-lib\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294371 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-slash\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294651 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-netd\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294686 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-var-lib-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294697 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-netd\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294713 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-systemd-units\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294752 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-var-lib-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294754 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-systemd-units\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294790 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-netns\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294819 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-netns\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294822 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294848 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-bin\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295071 master-0 kubenswrapper[4102]: I0312 12:20:30.294873 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.294882 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-openvswitch\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.294909 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovn-node-metrics-cert\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.294971 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-bin\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.294982 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-kubelet\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.295031 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.295109 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-ovn-kubernetes\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.295157 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.295197 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-kubelet\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.295202 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-env-overrides\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.295592 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-script-lib\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.295982 master-0 kubenswrapper[4102]: I0312 12:20:30.295919 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-config\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.298934 master-0 kubenswrapper[4102]: I0312 12:20:30.298893 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovn-node-metrics-cert\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.311311 master-0 kubenswrapper[4102]: I0312 12:20:30.311271 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfmp5\" (UniqueName: \"kubernetes.io/projected/b7b07b83-cade-4b52-b568-5c7e9d7c496b-kube-api-access-mfmp5\") pod \"ovnkube-node-xsbxb\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.403347 master-0 kubenswrapper[4102]: I0312 12:20:30.403308 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:20:30.413614 master-0 kubenswrapper[4102]: W0312 12:20:30.412491 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7b07b83_cade_4b52_b568_5c7e9d7c496b.slice/crio-b741cbd42bcb7d841acb6d1567fee58839a8dcf67f2d4afb5121784d68ade860 WatchSource:0}: Error finding container b741cbd42bcb7d841acb6d1567fee58839a8dcf67f2d4afb5121784d68ade860: Status 404 returned error can't find the container with id b741cbd42bcb7d841acb6d1567fee58839a8dcf67f2d4afb5121784d68ade860
Mar 12 12:20:30.598572 master-0 kubenswrapper[4102]: I0312 12:20:30.598454 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:20:30.598746 master-0 kubenswrapper[4102]: E0312 12:20:30.598666 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:20:30.598789 master-0 kubenswrapper[4102]: E0312 12:20:30.598767 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:02.598734026 +0000 UTC m=+107.531510451 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:20:30.885051 master-0 kubenswrapper[4102]: I0312 12:20:30.884979 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"b741cbd42bcb7d841acb6d1567fee58839a8dcf67f2d4afb5121784d68ade860"}
Mar 12 12:20:30.886398 master-0 kubenswrapper[4102]: I0312 12:20:30.886350 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" event={"ID":"2f3a291a-d9af-4e0f-a307-8928e4dc523d","Type":"ContainerStarted","Data":"f1fd5d6d3b3585097cce20bb0e0dcbc5c960bcbae48acf32800a8e41dcc361b2"}
Mar 12 12:20:30.886461 master-0 kubenswrapper[4102]: I0312 12:20:30.886402 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" event={"ID":"2f3a291a-d9af-4e0f-a307-8928e4dc523d","Type":"ContainerStarted","Data":"ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e"}
Mar 12 12:20:31.838889 master-0 kubenswrapper[4102]: I0312 12:20:31.838801 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:20:31.839581 master-0 kubenswrapper[4102]: E0312 12:20:31.839465 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:20:31.850888 master-0 kubenswrapper[4102]: W0312 12:20:31.850791 4102 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 12 12:20:31.851031 master-0 kubenswrapper[4102]: I0312 12:20:31.850977 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 12 12:20:32.895215 master-0 kubenswrapper[4102]: I0312 12:20:32.895171 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerStarted","Data":"9b70d60ef3db50f988d7297005c87ae9142093113f8ee25c0d2a4d1f3023050e"}
Mar 12 12:20:32.910209 master-0 kubenswrapper[4102]: I0312 12:20:32.910111 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=1.9100731039999999 podStartE2EDuration="1.910073104s" podCreationTimestamp="2026-03-12 12:20:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:20:32.909457829 +0000 UTC m=+77.842234254" watchObservedRunningTime="2026-03-12 12:20:32.910073104 +0000 UTC m=+77.842849519"
Mar 12 12:20:33.543673 master-0 kubenswrapper[4102]: I0312 12:20:33.543472 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-dfz7x"]
Mar 12 12:20:33.543978 master-0 kubenswrapper[4102]: I0312 12:20:33.543807 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:20:33.543978 master-0 kubenswrapper[4102]: E0312 12:20:33.543872 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1"
Mar 12 12:20:33.624786 master-0 kubenswrapper[4102]: I0312 12:20:33.624722 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:20:33.725573 master-0 kubenswrapper[4102]: I0312 12:20:33.725446 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:20:33.739345 master-0 kubenswrapper[4102]: E0312 12:20:33.738996 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 12:20:33.739345 master-0 kubenswrapper[4102]: E0312 12:20:33.739029 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 12:20:33.739345 master-0 kubenswrapper[4102]: E0312 12:20:33.739043 4102 projected.go:194] Error preparing data for projected volume kube-api-access-jxt29 for pod openshift-network-diagnostics/network-check-target-dfz7x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 12:20:33.739345 master-0 kubenswrapper[4102]: E0312 12:20:33.739133 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29 podName:269d77d9-815e-4324-8827-1ce429063ed1 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:34.239106347 +0000 UTC m=+79.171882762 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-jxt29" (UniqueName: "kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29") pod "network-check-target-dfz7x" (UID: "269d77d9-815e-4324-8827-1ce429063ed1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 12:20:33.839584 master-0 kubenswrapper[4102]: I0312 12:20:33.838784 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:20:33.839584 master-0 kubenswrapper[4102]: E0312 12:20:33.838948 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:20:34.331397 master-0 kubenswrapper[4102]: I0312 12:20:34.331324 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:20:34.331955 master-0 kubenswrapper[4102]: E0312 12:20:34.331520 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 12:20:34.331955 master-0 kubenswrapper[4102]: E0312 12:20:34.331539 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 12:20:34.331955 master-0 kubenswrapper[4102]: E0312 12:20:34.331552 4102 projected.go:194] Error preparing data for projected volume kube-api-access-jxt29 for pod openshift-network-diagnostics/network-check-target-dfz7x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 12:20:34.331955 master-0 kubenswrapper[4102]: E0312 12:20:34.331601 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29 podName:269d77d9-815e-4324-8827-1ce429063ed1 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:35.331584618 +0000 UTC m=+80.264361033 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxt29" (UniqueName: "kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29") pod "network-check-target-dfz7x" (UID: "269d77d9-815e-4324-8827-1ce429063ed1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 12:20:34.835920 master-0 kubenswrapper[4102]: I0312 12:20:34.835809 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:20:34.836189 master-0 kubenswrapper[4102]: E0312 12:20:34.836012 4102 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 12 12:20:34.836189 master-0 kubenswrapper[4102]: E0312 12:20:34.836094 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:20:50.836075448 +0000 UTC m=+95.768851853 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:35.339513 master-0 kubenswrapper[4102]: I0312 12:20:35.339369 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:35.340306 master-0 kubenswrapper[4102]: E0312 12:20:35.339590 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 12:20:35.340306 master-0 kubenswrapper[4102]: E0312 12:20:35.339618 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 12:20:35.340306 master-0 kubenswrapper[4102]: E0312 12:20:35.339634 4102 projected.go:194] Error preparing data for projected volume kube-api-access-jxt29 for pod openshift-network-diagnostics/network-check-target-dfz7x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:35.340306 master-0 kubenswrapper[4102]: E0312 12:20:35.339685 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29 podName:269d77d9-815e-4324-8827-1ce429063ed1 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:20:37.339669537 +0000 UTC m=+82.272445972 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxt29" (UniqueName: "kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29") pod "network-check-target-dfz7x" (UID: "269d77d9-815e-4324-8827-1ce429063ed1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:35.839033 master-0 kubenswrapper[4102]: I0312 12:20:35.838751 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:35.839033 master-0 kubenswrapper[4102]: I0312 12:20:35.838754 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:35.839321 master-0 kubenswrapper[4102]: E0312 12:20:35.839253 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:35.839440 master-0 kubenswrapper[4102]: E0312 12:20:35.839411 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:36.723732 master-0 kubenswrapper[4102]: I0312 12:20:36.723626 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-rzmhl"] Mar 12 12:20:36.724521 master-0 kubenswrapper[4102]: I0312 12:20:36.724444 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.729587 master-0 kubenswrapper[4102]: I0312 12:20:36.729242 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 12:20:36.729587 master-0 kubenswrapper[4102]: I0312 12:20:36.729312 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 12:20:36.729587 master-0 kubenswrapper[4102]: I0312 12:20:36.729326 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 12:20:36.730603 master-0 kubenswrapper[4102]: I0312 12:20:36.730086 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 12:20:36.730603 master-0 kubenswrapper[4102]: I0312 12:20:36.730564 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 12:20:36.751052 master-0 kubenswrapper[4102]: I0312 12:20:36.750872 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.751052 master-0 
kubenswrapper[4102]: I0312 12:20:36.750946 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.751052 master-0 kubenswrapper[4102]: I0312 12:20:36.750983 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wg27g\" (UniqueName: \"kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.751316 master-0 kubenswrapper[4102]: I0312 12:20:36.751114 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.853647 master-0 kubenswrapper[4102]: I0312 12:20:36.852174 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.853647 master-0 kubenswrapper[4102]: I0312 12:20:36.852338 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides\") pod 
\"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.853647 master-0 kubenswrapper[4102]: I0312 12:20:36.852408 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg27g\" (UniqueName: \"kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.853647 master-0 kubenswrapper[4102]: I0312 12:20:36.852456 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.853647 master-0 kubenswrapper[4102]: I0312 12:20:36.853579 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.854427 master-0 kubenswrapper[4102]: I0312 12:20:36.854390 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.866309 master-0 kubenswrapper[4102]: I0312 12:20:36.858728 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.870129 master-0 kubenswrapper[4102]: I0312 12:20:36.870064 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg27g\" (UniqueName: \"kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:36.910949 master-0 kubenswrapper[4102]: I0312 12:20:36.910128 4102 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="9b70d60ef3db50f988d7297005c87ae9142093113f8ee25c0d2a4d1f3023050e" exitCode=0 Mar 12 12:20:36.910949 master-0 kubenswrapper[4102]: I0312 12:20:36.910178 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerDied","Data":"9b70d60ef3db50f988d7297005c87ae9142093113f8ee25c0d2a4d1f3023050e"} Mar 12 12:20:37.044891 master-0 kubenswrapper[4102]: I0312 12:20:37.044771 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:20:37.356456 master-0 kubenswrapper[4102]: I0312 12:20:37.356332 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:37.356698 master-0 kubenswrapper[4102]: E0312 12:20:37.356634 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 12:20:37.356698 master-0 kubenswrapper[4102]: E0312 12:20:37.356698 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 12:20:37.356790 master-0 kubenswrapper[4102]: E0312 12:20:37.356718 4102 projected.go:194] Error preparing data for projected volume kube-api-access-jxt29 for pod openshift-network-diagnostics/network-check-target-dfz7x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:37.356823 master-0 kubenswrapper[4102]: E0312 12:20:37.356801 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29 podName:269d77d9-815e-4324-8827-1ce429063ed1 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:41.356776736 +0000 UTC m=+86.289553151 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jxt29" (UniqueName: "kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29") pod "network-check-target-dfz7x" (UID: "269d77d9-815e-4324-8827-1ce429063ed1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:37.841099 master-0 kubenswrapper[4102]: I0312 12:20:37.839428 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:37.841099 master-0 kubenswrapper[4102]: I0312 12:20:37.839706 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:37.841099 master-0 kubenswrapper[4102]: E0312 12:20:37.839833 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:37.841099 master-0 kubenswrapper[4102]: E0312 12:20:37.839949 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:37.914810 master-0 kubenswrapper[4102]: I0312 12:20:37.914742 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-rzmhl" event={"ID":"51d58450-50bb-4da0-b1f6-4135fbabd856","Type":"ContainerStarted","Data":"fd48f85cdff86dca5fd974e33c046e44989ec396616456ae028b5495072f5b8b"} Mar 12 12:20:38.921345 master-0 kubenswrapper[4102]: I0312 12:20:38.921277 4102 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="ef1ee7f5f63359043faa6c46c732e5167be16d3211712dbb5e513926f5b91304" exitCode=0 Mar 12 12:20:38.921345 master-0 kubenswrapper[4102]: I0312 12:20:38.921358 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerDied","Data":"ef1ee7f5f63359043faa6c46c732e5167be16d3211712dbb5e513926f5b91304"} Mar 12 12:20:38.924610 master-0 kubenswrapper[4102]: I0312 12:20:38.924552 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-hb48g" event={"ID":"666857a1-0ddf-4b48-91f4-44cce154d1b1","Type":"ContainerStarted","Data":"a0a6d983468bc5666da5a25f46a0ac8a483d82270c70b0e028471f831a83eca6"} Mar 12 12:20:39.839797 master-0 kubenswrapper[4102]: I0312 12:20:39.839738 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:39.839960 master-0 kubenswrapper[4102]: I0312 12:20:39.839777 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:39.840045 master-0 kubenswrapper[4102]: E0312 12:20:39.839975 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:39.840086 master-0 kubenswrapper[4102]: E0312 12:20:39.840049 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:39.930084 master-0 kubenswrapper[4102]: I0312 12:20:39.929920 4102 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="ea68604b446ee8cffe24318b7151377c7b04f157b8ea561b83368baecd127158" exitCode=0 Mar 12 12:20:39.930084 master-0 kubenswrapper[4102]: I0312 12:20:39.929979 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerDied","Data":"ea68604b446ee8cffe24318b7151377c7b04f157b8ea561b83368baecd127158"} Mar 12 12:20:39.951223 master-0 kubenswrapper[4102]: I0312 12:20:39.951130 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-hb48g" podStartSLOduration=3.311039369 podStartE2EDuration="22.951109642s" podCreationTimestamp="2026-03-12 12:20:17 +0000 UTC" firstStartedPulling="2026-03-12 12:20:18.120802856 +0000 
UTC m=+63.053579311" lastFinishedPulling="2026-03-12 12:20:37.760873179 +0000 UTC m=+82.693649584" observedRunningTime="2026-03-12 12:20:38.958723509 +0000 UTC m=+83.891499944" watchObservedRunningTime="2026-03-12 12:20:39.951109642 +0000 UTC m=+84.883886057" Mar 12 12:20:41.399025 master-0 kubenswrapper[4102]: I0312 12:20:41.398957 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:41.399716 master-0 kubenswrapper[4102]: E0312 12:20:41.399287 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 12:20:41.399716 master-0 kubenswrapper[4102]: E0312 12:20:41.399368 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 12:20:41.399716 master-0 kubenswrapper[4102]: E0312 12:20:41.399391 4102 projected.go:194] Error preparing data for projected volume kube-api-access-jxt29 for pod openshift-network-diagnostics/network-check-target-dfz7x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:41.399716 master-0 kubenswrapper[4102]: E0312 12:20:41.399606 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29 podName:269d77d9-815e-4324-8827-1ce429063ed1 nodeName:}" failed. No retries permitted until 2026-03-12 12:20:49.399522221 +0000 UTC m=+94.332298646 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jxt29" (UniqueName: "kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29") pod "network-check-target-dfz7x" (UID: "269d77d9-815e-4324-8827-1ce429063ed1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:41.838973 master-0 kubenswrapper[4102]: I0312 12:20:41.838910 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:41.839166 master-0 kubenswrapper[4102]: I0312 12:20:41.838967 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:41.839166 master-0 kubenswrapper[4102]: E0312 12:20:41.839080 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:41.839242 master-0 kubenswrapper[4102]: E0312 12:20:41.839210 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:42.852268 master-0 kubenswrapper[4102]: I0312 12:20:42.852236 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 12 12:20:43.839702 master-0 kubenswrapper[4102]: I0312 12:20:43.839655 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:43.839898 master-0 kubenswrapper[4102]: E0312 12:20:43.839766 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:43.840138 master-0 kubenswrapper[4102]: I0312 12:20:43.840123 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:43.840196 master-0 kubenswrapper[4102]: E0312 12:20:43.840179 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:45.838895 master-0 kubenswrapper[4102]: I0312 12:20:45.838850 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:45.838895 master-0 kubenswrapper[4102]: I0312 12:20:45.838867 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:45.842060 master-0 kubenswrapper[4102]: E0312 12:20:45.842022 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:45.842368 master-0 kubenswrapper[4102]: E0312 12:20:45.842337 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:46.198394 master-0 kubenswrapper[4102]: I0312 12:20:46.198311 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=4.198295039 podStartE2EDuration="4.198295039s" podCreationTimestamp="2026-03-12 12:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:20:46.197970011 +0000 UTC m=+91.130746426" watchObservedRunningTime="2026-03-12 12:20:46.198295039 +0000 UTC m=+91.131071454" Mar 12 12:20:47.839725 master-0 kubenswrapper[4102]: I0312 12:20:47.839643 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:47.840548 master-0 kubenswrapper[4102]: I0312 12:20:47.839757 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:47.840548 master-0 kubenswrapper[4102]: E0312 12:20:47.840244 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:47.840548 master-0 kubenswrapper[4102]: E0312 12:20:47.840439 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:48.340144 master-0 kubenswrapper[4102]: I0312 12:20:48.340041 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 12 12:20:49.459537 master-0 kubenswrapper[4102]: I0312 12:20:49.459492 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:49.460037 master-0 kubenswrapper[4102]: E0312 12:20:49.459615 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 12 12:20:49.460037 master-0 kubenswrapper[4102]: E0312 12:20:49.459644 4102 projected.go:288] 
Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 12 12:20:49.460037 master-0 kubenswrapper[4102]: E0312 12:20:49.459656 4102 projected.go:194] Error preparing data for projected volume kube-api-access-jxt29 for pod openshift-network-diagnostics/network-check-target-dfz7x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:49.460037 master-0 kubenswrapper[4102]: E0312 12:20:49.459709 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29 podName:269d77d9-815e-4324-8827-1ce429063ed1 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:05.459694342 +0000 UTC m=+110.392470757 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxt29" (UniqueName: "kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29") pod "network-check-target-dfz7x" (UID: "269d77d9-815e-4324-8827-1ce429063ed1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 12 12:20:49.839545 master-0 kubenswrapper[4102]: I0312 12:20:49.839425 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:49.839695 master-0 kubenswrapper[4102]: I0312 12:20:49.839465 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:49.839733 master-0 kubenswrapper[4102]: E0312 12:20:49.839691 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:49.839733 master-0 kubenswrapper[4102]: E0312 12:20:49.839712 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:50.869039 master-0 kubenswrapper[4102]: I0312 12:20:50.868891 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:50.869709 master-0 kubenswrapper[4102]: E0312 12:20:50.869076 4102 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:50.869709 master-0 kubenswrapper[4102]: E0312 12:20:50.869172 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. 
No retries permitted until 2026-03-12 12:21:22.869149758 +0000 UTC m=+127.801926173 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 12 12:20:51.838766 master-0 kubenswrapper[4102]: I0312 12:20:51.838705 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:51.838766 master-0 kubenswrapper[4102]: I0312 12:20:51.838739 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:51.839015 master-0 kubenswrapper[4102]: E0312 12:20:51.838815 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:51.839015 master-0 kubenswrapper[4102]: E0312 12:20:51.838933 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:52.963726 master-0 kubenswrapper[4102]: I0312 12:20:52.963548 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a" exitCode=0 Mar 12 12:20:52.963726 master-0 kubenswrapper[4102]: I0312 12:20:52.963615 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"} Mar 12 12:20:52.969649 master-0 kubenswrapper[4102]: I0312 12:20:52.969617 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" event={"ID":"2f3a291a-d9af-4e0f-a307-8928e4dc523d","Type":"ContainerStarted","Data":"bf8bddde515df885e66ccd95c414dbc03cb4e50f31ec3dd72464e9bd5b698d49"} Mar 12 12:20:52.973125 master-0 kubenswrapper[4102]: I0312 12:20:52.973094 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-rzmhl" event={"ID":"51d58450-50bb-4da0-b1f6-4135fbabd856","Type":"ContainerStarted","Data":"2c2b5cd50e4b41a7c3aafd02e56e622ce6b2150721ba8e3603b831c988a04475"} Mar 12 12:20:52.973125 master-0 kubenswrapper[4102]: I0312 12:20:52.973126 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-rzmhl" event={"ID":"51d58450-50bb-4da0-b1f6-4135fbabd856","Type":"ContainerStarted","Data":"e03c733924b0eea3b5b79909052986f89b22c571dfd7876833ef238212c99f5b"} Mar 12 12:20:52.976430 master-0 kubenswrapper[4102]: I0312 12:20:52.976382 4102 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" 
containerID="fb3151f0498b4d271395613cdb5a66c2bdbd18c371f71b100988d9a1524ba2df" exitCode=0 Mar 12 12:20:52.976430 master-0 kubenswrapper[4102]: I0312 12:20:52.976411 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerDied","Data":"fb3151f0498b4d271395613cdb5a66c2bdbd18c371f71b100988d9a1524ba2df"} Mar 12 12:20:53.015113 master-0 kubenswrapper[4102]: I0312 12:20:53.011458 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=6.011442721 podStartE2EDuration="6.011442721s" podCreationTimestamp="2026-03-12 12:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:20:52.979355341 +0000 UTC m=+97.912131746" watchObservedRunningTime="2026-03-12 12:20:53.011442721 +0000 UTC m=+97.944219136" Mar 12 12:20:53.029647 master-0 kubenswrapper[4102]: I0312 12:20:53.029140 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-rzmhl" podStartSLOduration=2.183439055 podStartE2EDuration="17.029116735s" podCreationTimestamp="2026-03-12 12:20:36 +0000 UTC" firstStartedPulling="2026-03-12 12:20:37.700133112 +0000 UTC m=+82.632909547" lastFinishedPulling="2026-03-12 12:20:52.545810772 +0000 UTC m=+97.478587227" observedRunningTime="2026-03-12 12:20:53.028833538 +0000 UTC m=+97.961609953" watchObservedRunningTime="2026-03-12 12:20:53.029116735 +0000 UTC m=+97.961893180" Mar 12 12:20:53.839997 master-0 kubenswrapper[4102]: I0312 12:20:53.839501 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:53.840267 master-0 kubenswrapper[4102]: E0312 12:20:53.840143 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:53.840921 master-0 kubenswrapper[4102]: I0312 12:20:53.840634 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:53.840921 master-0 kubenswrapper[4102]: E0312 12:20:53.840894 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:53.983239 master-0 kubenswrapper[4102]: I0312 12:20:53.983156 4102 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="609bc41bd5850647fabc2e01d12345f9b41d6cc4ea84bcb7679ae4b6d13d442e" exitCode=0 Mar 12 12:20:53.984984 master-0 kubenswrapper[4102]: I0312 12:20:53.983243 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerDied","Data":"609bc41bd5850647fabc2e01d12345f9b41d6cc4ea84bcb7679ae4b6d13d442e"} Mar 12 12:20:53.987126 master-0 kubenswrapper[4102]: I0312 12:20:53.987074 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"} Mar 12 12:20:53.987206 master-0 kubenswrapper[4102]: I0312 12:20:53.987127 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"} Mar 12 12:20:53.987206 master-0 kubenswrapper[4102]: I0312 12:20:53.987139 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"} Mar 12 12:20:53.987206 master-0 kubenswrapper[4102]: I0312 12:20:53.987152 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" 
event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"} Mar 12 12:20:53.987206 master-0 kubenswrapper[4102]: I0312 12:20:53.987161 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"} Mar 12 12:20:53.987206 master-0 kubenswrapper[4102]: I0312 12:20:53.987172 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"} Mar 12 12:20:54.036009 master-0 kubenswrapper[4102]: I0312 12:20:54.035731 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" podStartSLOduration=2.920842737 podStartE2EDuration="25.035712218s" podCreationTimestamp="2026-03-12 12:20:29 +0000 UTC" firstStartedPulling="2026-03-12 12:20:30.41628106 +0000 UTC m=+75.349057475" lastFinishedPulling="2026-03-12 12:20:52.531150541 +0000 UTC m=+97.463926956" observedRunningTime="2026-03-12 12:20:53.072074965 +0000 UTC m=+98.004851420" watchObservedRunningTime="2026-03-12 12:20:54.035712218 +0000 UTC m=+98.968488633" Mar 12 12:20:54.997096 master-0 kubenswrapper[4102]: I0312 12:20:54.996985 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-r86hc" event={"ID":"10498208-0692-4533-b672-a7a2cfcdf1be","Type":"ContainerStarted","Data":"733accb0a94e97490e093b839e0d3206ec03c5fdecabddc4e6c31fe3665589a4"} Mar 12 12:20:55.838683 master-0 kubenswrapper[4102]: I0312 12:20:55.838622 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:55.838852 master-0 kubenswrapper[4102]: I0312 12:20:55.838698 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:55.848241 master-0 kubenswrapper[4102]: E0312 12:20:55.848143 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:55.848447 master-0 kubenswrapper[4102]: E0312 12:20:55.848296 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:56.478406 master-0 kubenswrapper[4102]: I0312 12:20:56.478303 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-r86hc" podStartSLOduration=5.348435414 podStartE2EDuration="39.478279921s" podCreationTimestamp="2026-03-12 12:20:17 +0000 UTC" firstStartedPulling="2026-03-12 12:20:18.327197306 +0000 UTC m=+63.259973721" lastFinishedPulling="2026-03-12 12:20:52.457041793 +0000 UTC m=+97.389818228" observedRunningTime="2026-03-12 12:20:55.023896979 +0000 UTC m=+99.956673454" watchObservedRunningTime="2026-03-12 12:20:56.478279921 +0000 UTC m=+101.411056366" Mar 12 12:20:56.479538 master-0 kubenswrapper[4102]: I0312 12:20:56.479499 4102 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsbxb"] Mar 12 12:20:57.009644 master-0 kubenswrapper[4102]: I0312 12:20:57.009573 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"} Mar 12 12:20:57.839427 master-0 kubenswrapper[4102]: I0312 12:20:57.839333 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:57.840232 master-0 kubenswrapper[4102]: E0312 12:20:57.839517 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:57.840232 master-0 kubenswrapper[4102]: I0312 12:20:57.839576 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:57.840232 master-0 kubenswrapper[4102]: E0312 12:20:57.839753 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:59.024138 master-0 kubenswrapper[4102]: I0312 12:20:59.024029 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerStarted","Data":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"} Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.024425 4102 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-controller" containerID="cri-o://317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28" gracePeriod=30 Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.024618 4102 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30" gracePeriod=30 Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.024645 4102 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="nbdb" containerID="cri-o://3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" gracePeriod=30 Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.024697 4102 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-node" containerID="cri-o://f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049" gracePeriod=30 Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.024789 4102 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-acl-logging" containerID="cri-o://2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff" gracePeriod=30 Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.024797 4102 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="northd" containerID="cri-o://c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07" gracePeriod=30 Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.024443 4102 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="sbdb" containerID="cri-o://abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" gracePeriod=30 Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.025006 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" Mar 12 12:20:59.025934 master-0 
kubenswrapper[4102]: I0312 12:20:59.025038 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" Mar 12 12:20:59.025934 master-0 kubenswrapper[4102]: I0312 12:20:59.025052 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" Mar 12 12:20:59.029790 master-0 kubenswrapper[4102]: E0312 12:20:59.029516 4102 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 12 12:20:59.031854 master-0 kubenswrapper[4102]: E0312 12:20:59.031787 4102 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 12 12:20:59.034066 master-0 kubenswrapper[4102]: E0312 12:20:59.033103 4102 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 12 12:20:59.037093 master-0 kubenswrapper[4102]: E0312 12:20:59.037015 4102 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"sb\"\n"] Mar 12 12:20:59.037093 master-0 kubenswrapper[4102]: E0312 12:20:59.037080 4102 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="sbdb" Mar 12 12:20:59.037279 master-0 kubenswrapper[4102]: E0312 12:20:59.037205 4102 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" cmd=["/bin/bash","-c","set -xeo pipefail\n. /ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 12 12:20:59.042960 master-0 kubenswrapper[4102]: E0312 12:20:59.041566 4102 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" cmd=["/bin/bash","-c","set -xeo pipefail\n. 
/ovnkube-lib/ovnkube-lib.sh || exit 1\novndb-readiness-probe \"nb\"\n"] Mar 12 12:20:59.042960 master-0 kubenswrapper[4102]: E0312 12:20:59.041652 4102 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="nbdb" Mar 12 12:20:59.062392 master-0 kubenswrapper[4102]: I0312 12:20:59.061964 4102 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovnkube-controller" containerID="cri-o://9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04" gracePeriod=30 Mar 12 12:20:59.743212 master-0 kubenswrapper[4102]: I0312 12:20:59.743150 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/ovnkube-controller/0.log" Mar 12 12:20:59.744763 master-0 kubenswrapper[4102]: I0312 12:20:59.744722 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/kube-rbac-proxy-ovn-metrics/0.log" Mar 12 12:20:59.745506 master-0 kubenswrapper[4102]: I0312 12:20:59.745437 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/kube-rbac-proxy-node/0.log" Mar 12 12:20:59.745986 master-0 kubenswrapper[4102]: I0312 12:20:59.745951 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/ovn-acl-logging/0.log" Mar 12 12:20:59.746816 master-0 kubenswrapper[4102]: I0312 12:20:59.746793 4102 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/ovn-controller/0.log" Mar 12 12:20:59.747592 master-0 kubenswrapper[4102]: I0312 12:20:59.747553 4102 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" Mar 12 12:20:59.797195 master-0 kubenswrapper[4102]: I0312 12:20:59.797111 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-l5d2w"] Mar 12 12:20:59.797445 master-0 kubenswrapper[4102]: E0312 12:20:59.797404 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="nbdb" Mar 12 12:20:59.797543 master-0 kubenswrapper[4102]: I0312 12:20:59.797446 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="nbdb" Mar 12 12:20:59.797543 master-0 kubenswrapper[4102]: E0312 12:20:59.797469 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-node" Mar 12 12:20:59.797543 master-0 kubenswrapper[4102]: I0312 12:20:59.797513 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-node" Mar 12 12:20:59.797543 master-0 kubenswrapper[4102]: E0312 12:20:59.797532 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-controller" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: I0312 12:20:59.797549 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-controller" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: E0312 12:20:59.797570 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 
12:20:59.797691 master-0 kubenswrapper[4102]: I0312 12:20:59.797587 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: E0312 12:20:59.797607 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovnkube-controller" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: I0312 12:20:59.797623 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovnkube-controller" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: E0312 12:20:59.797644 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kubecfg-setup" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: I0312 12:20:59.797660 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kubecfg-setup" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: E0312 12:20:59.797678 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="sbdb" Mar 12 12:20:59.797691 master-0 kubenswrapper[4102]: I0312 12:20:59.797694 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="sbdb" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: E0312 12:20:59.797713 4102 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-acl-logging" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797732 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-acl-logging" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: E0312 12:20:59.797750 4102 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="northd" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797768 4102 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="northd" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797854 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-acl-logging" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797875 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-node" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797895 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovnkube-controller" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797913 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="northd" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797928 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="nbdb" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797945 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="ovn-controller" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797962 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="kube-rbac-proxy-ovn-metrics" Mar 12 12:20:59.798166 master-0 kubenswrapper[4102]: I0312 12:20:59.797978 4102 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerName="sbdb" Mar 12 
12:20:59.799693 master-0 kubenswrapper[4102]: I0312 12:20:59.799649 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.842141 master-0 kubenswrapper[4102]: I0312 12:20:59.841301 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:20:59.842141 master-0 kubenswrapper[4102]: E0312 12:20:59.841410 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1" Mar 12 12:20:59.842141 master-0 kubenswrapper[4102]: I0312 12:20:59.841767 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:20:59.842141 master-0 kubenswrapper[4102]: E0312 12:20:59.841816 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc" Mar 12 12:20:59.856848 master-0 kubenswrapper[4102]: I0312 12:20:59.856788 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-systemd-units\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.856997 master-0 kubenswrapper[4102]: I0312 12:20:59.856863 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.856997 master-0 kubenswrapper[4102]: I0312 12:20:59.856925 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-var-lib-openvswitch\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857064 master-0 kubenswrapper[4102]: I0312 12:20:59.856996 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.857064 master-0 kubenswrapper[4102]: I0312 12:20:59.857035 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-openvswitch\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857146 master-0 kubenswrapper[4102]: I0312 12:20:59.857083 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-kubelet\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857146 master-0 kubenswrapper[4102]: I0312 12:20:59.857110 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.857146 master-0 kubenswrapper[4102]: I0312 12:20:59.857109 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.857346 master-0 kubenswrapper[4102]: I0312 12:20:59.857279 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857417 master-0 kubenswrapper[4102]: I0312 12:20:59.857368 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.857636 master-0 kubenswrapper[4102]: I0312 12:20:59.857555 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-bin\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857723 master-0 kubenswrapper[4102]: I0312 12:20:59.857657 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-slash\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857723 master-0 kubenswrapper[4102]: I0312 12:20:59.857661 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod 
"b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.857816 master-0 kubenswrapper[4102]: I0312 12:20:59.857695 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-slash" (OuterVolumeSpecName: "host-slash") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.857816 master-0 kubenswrapper[4102]: I0312 12:20:59.857723 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-config\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857816 master-0 kubenswrapper[4102]: I0312 12:20:59.857769 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-netns\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857816 master-0 kubenswrapper[4102]: I0312 12:20:59.857800 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-systemd\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.857816 master-0 kubenswrapper[4102]: I0312 12:20:59.857809 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod 
"b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.857840 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfmp5\" (UniqueName: \"kubernetes.io/projected/b7b07b83-cade-4b52-b568-5c7e9d7c496b-kube-api-access-mfmp5\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.857877 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-netd\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.857910 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-node-log\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.857939 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-ovn\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.857975 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovn-node-metrics-cert\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: 
\"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858005 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-etc-openvswitch\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858038 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-env-overrides\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858045 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-node-log" (OuterVolumeSpecName: "node-log") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858071 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-script-lib\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858101 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858104 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-ovn-kubernetes\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858084 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858104 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858188 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-log-socket" (OuterVolumeSpecName: "log-socket") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858137 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858157 4102 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-log-socket\") pod \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\" (UID: \"b7b07b83-cade-4b52-b568-5c7e9d7c496b\") " Mar 12 12:20:59.858607 master-0 kubenswrapper[4102]: I0312 12:20:59.858461 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858574 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858569 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod 
"b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858603 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858639 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858712 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858734 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858767 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858800 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858856 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858909 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859006 master-0 kubenswrapper[4102]: I0312 12:20:59.858957 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859370 master-0 kubenswrapper[4102]: I0312 12:20:59.859007 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859370 master-0 kubenswrapper[4102]: I0312 12:20:59.859097 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859370 master-0 kubenswrapper[4102]: I0312 12:20:59.859216 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859370 master-0 kubenswrapper[4102]: I0312 12:20:59.859267 4102 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859370 master-0 kubenswrapper[4102]: I0312 12:20:59.859317 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd645\" (UniqueName: \"kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859508 master-0 kubenswrapper[4102]: I0312 12:20:59.859370 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859508 master-0 kubenswrapper[4102]: I0312 12:20:59.859418 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859624 master-0 kubenswrapper[4102]: I0312 12:20:59.859581 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859675 master-0 kubenswrapper[4102]: I0312 12:20:59.859646 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859737 master-0 kubenswrapper[4102]: I0312 12:20:59.859710 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859793 master-0 kubenswrapper[4102]: I0312 12:20:59.859765 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:20:59.859895 master-0 kubenswrapper[4102]: I0312 12:20:59.859862 4102 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-env-overrides\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.859930 master-0 kubenswrapper[4102]: I0312 12:20:59.859903 4102 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.859956 master-0 kubenswrapper[4102]: I0312 12:20:59.859928 4102 reconciler_common.go:293] "Volume detached for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.859984 master-0 kubenswrapper[4102]: I0312 12:20:59.859949 4102 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860010 master-0 kubenswrapper[4102]: I0312 12:20:59.859997 4102 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-log-socket\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860035 master-0 kubenswrapper[4102]: I0312 12:20:59.860018 4102 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-systemd-units\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860063 master-0 kubenswrapper[4102]: I0312 12:20:59.860039 4102 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860063 master-0 kubenswrapper[4102]: I0312 12:20:59.860058 4102 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860109 master-0 kubenswrapper[4102]: I0312 12:20:59.860076 4102 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-kubelet\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860109 master-0 kubenswrapper[4102]: I0312 12:20:59.860097 4102 reconciler_common.go:293] "Volume 
detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860155 master-0 kubenswrapper[4102]: I0312 12:20:59.860118 4102 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860155 master-0 kubenswrapper[4102]: I0312 12:20:59.860139 4102 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-slash\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860204 master-0 kubenswrapper[4102]: I0312 12:20:59.860158 4102 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovnkube-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860204 master-0 kubenswrapper[4102]: I0312 12:20:59.860179 4102 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-run-netns\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860204 master-0 kubenswrapper[4102]: I0312 12:20:59.860197 4102 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-host-cni-netd\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860284 master-0 kubenswrapper[4102]: I0312 12:20:59.860216 4102 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-node-log\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.860284 master-0 kubenswrapper[4102]: I0312 12:20:59.860236 4102 
reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 12 12:20:59.864999 master-0 kubenswrapper[4102]: I0312 12:20:59.864966 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7b07b83-cade-4b52-b568-5c7e9d7c496b-kube-api-access-mfmp5" (OuterVolumeSpecName: "kube-api-access-mfmp5") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "kube-api-access-mfmp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:20:59.866786 master-0 kubenswrapper[4102]: I0312 12:20:59.865739 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:20:59.868331 master-0 kubenswrapper[4102]: I0312 12:20:59.868293 4102 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b7b07b83-cade-4b52-b568-5c7e9d7c496b" (UID: "b7b07b83-cade-4b52-b568-5c7e9d7c496b"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:20:59.961618 master-0 kubenswrapper[4102]: I0312 12:20:59.961540 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.961844 master-0 kubenswrapper[4102]: I0312 12:20:59.961727 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.961844 master-0 kubenswrapper[4102]: I0312 12:20:59.961785 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.961844 master-0 kubenswrapper[4102]: I0312 12:20:59.961833 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.961844 master-0 kubenswrapper[4102]: I0312 12:20:59.961838 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.961994 master-0 kubenswrapper[4102]: I0312 12:20:59.961911 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.961994 master-0 kubenswrapper[4102]: I0312 12:20:59.961944 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd645\" (UniqueName: \"kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.961994 master-0 kubenswrapper[4102]: I0312 12:20:59.961981 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962169 master-0 kubenswrapper[4102]: I0312 12:20:59.962137 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962169 master-0 kubenswrapper[4102]: I0312 12:20:59.962157 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962246 master-0 kubenswrapper[4102]: I0312 12:20:59.962201 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962246 master-0 kubenswrapper[4102]: I0312 12:20:59.962233 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962329 master-0 kubenswrapper[4102]: I0312 12:20:59.962272 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962329 master-0 kubenswrapper[4102]: I0312 12:20:59.962284 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962329 master-0 kubenswrapper[4102]: I0312 12:20:59.962312 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962329 master-0 kubenswrapper[4102]: I0312 12:20:59.962320 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962515 master-0 kubenswrapper[4102]: I0312 12:20:59.962346 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962515 master-0 kubenswrapper[4102]: I0312 12:20:59.962360 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962515 master-0 kubenswrapper[4102]: I0312 12:20:59.962413 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962515 master-0 kubenswrapper[4102]: I0312 12:20:59.962457 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962515 master-0 kubenswrapper[4102]: I0312 12:20:59.962502 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962701 master-0 kubenswrapper[4102]: I0312 12:20:59.962554 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962701 master-0 kubenswrapper[4102]: I0312 12:20:59.962609 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962701 master-0 kubenswrapper[4102]: I0312 12:20:59.962644 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962701 master-0 kubenswrapper[4102]: I0312 12:20:59.962677 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962854 master-0 kubenswrapper[4102]: I0312 12:20:59.962701 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962854 master-0 kubenswrapper[4102]: I0312 12:20:59.962713 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962854 master-0 kubenswrapper[4102]: I0312 12:20:59.962750 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.962854 master-0 kubenswrapper[4102]: I0312 12:20:59.962770 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963008 master-0 kubenswrapper[4102]: I0312 12:20:59.962850 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963008 master-0 kubenswrapper[4102]: I0312 12:20:59.962899 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963008 master-0 kubenswrapper[4102]: I0312 12:20:59.963003 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963153 master-0 kubenswrapper[4102]: I0312 12:20:59.963046 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963153 master-0 kubenswrapper[4102]: I0312 12:20:59.963068 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963153 master-0 kubenswrapper[4102]: I0312 12:20:59.963145 4102 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b7b07b83-cade-4b52-b568-5c7e9d7c496b-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:20:59.963263 master-0 kubenswrapper[4102]: I0312 12:20:59.963171 4102 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b7b07b83-cade-4b52-b568-5c7e9d7c496b-run-systemd\") on node \"master-0\" DevicePath \"\""
Mar 12 12:20:59.963263 master-0 kubenswrapper[4102]: I0312 12:20:59.963184 4102 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfmp5\" (UniqueName: \"kubernetes.io/projected/b7b07b83-cade-4b52-b568-5c7e9d7c496b-kube-api-access-mfmp5\") on node \"master-0\" DevicePath \"\""
Mar 12 12:20:59.963263 master-0 kubenswrapper[4102]: I0312 12:20:59.963215 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963263 master-0 kubenswrapper[4102]: I0312 12:20:59.963240 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963520 master-0 kubenswrapper[4102]: I0312 12:20:59.963462 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.963869 master-0 kubenswrapper[4102]: I0312 12:20:59.963823 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.966075 master-0 kubenswrapper[4102]: I0312 12:20:59.966029 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:20:59.978065 master-0 kubenswrapper[4102]: I0312 12:20:59.978025 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd645\" (UniqueName: \"kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:00.029589 master-0 kubenswrapper[4102]: I0312 12:21:00.029526 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/ovnkube-controller/0.log"
Mar 12 12:21:00.031896 master-0 kubenswrapper[4102]: I0312 12:21:00.031860 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/kube-rbac-proxy-ovn-metrics/0.log"
Mar 12 12:21:00.032896 master-0 kubenswrapper[4102]: I0312 12:21:00.032856 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/kube-rbac-proxy-node/0.log"
Mar 12 12:21:00.033587 master-0 kubenswrapper[4102]: I0312 12:21:00.033559 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/ovn-acl-logging/0.log"
Mar 12 12:21:00.034230 master-0 kubenswrapper[4102]: I0312 12:21:00.034203 4102 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xsbxb_b7b07b83-cade-4b52-b568-5c7e9d7c496b/ovn-controller/0.log"
Mar 12 12:21:00.034945 master-0 kubenswrapper[4102]: I0312 12:21:00.034906 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04" exitCode=1
Mar 12 12:21:00.034945 master-0 kubenswrapper[4102]: I0312 12:21:00.034944 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" exitCode=0
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.034959 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" exitCode=0
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.034970 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07" exitCode=0
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.034983 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30" exitCode=143
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.034996 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049" exitCode=143
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.035007 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff" exitCode=143
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.035018 4102 generic.go:334] "Generic (PLEG): container finished" podID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" containerID="317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28" exitCode=143
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.035036 4102 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb"
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.035050 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"}
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.035104 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"}
Mar 12 12:21:00.035132 master-0 kubenswrapper[4102]: I0312 12:21:00.035149 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035165 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035180 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035196 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035260 4102 scope.go:117] "RemoveContainer" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035215 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035333 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035341 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035352 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035366 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035375 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035383 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035391 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035398 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035406 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035416 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035424 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035431 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035442 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035453 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035463 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035472 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035502 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035510 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035518 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035526 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035534 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"}
Mar 12 12:21:00.036187 master-0 kubenswrapper[4102]: I0312 12:21:00.035541 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035553 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xsbxb" event={"ID":"b7b07b83-cade-4b52-b568-5c7e9d7c496b","Type":"ContainerDied","Data":"b741cbd42bcb7d841acb6d1567fee58839a8dcf67f2d4afb5121784d68ade860"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035564 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035574 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035582 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035591 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035599 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035606 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035614 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035623 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"}
Mar 12 12:21:00.038515 master-0 kubenswrapper[4102]: I0312 12:21:00.035631 4102 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"}
Mar 12 12:21:00.050634 master-0 kubenswrapper[4102]: I0312 12:21:00.050574 4102 scope.go:117] "RemoveContainer" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"
Mar 12 12:21:00.060331 master-0 kubenswrapper[4102]: I0312 12:21:00.059871 4102 scope.go:117] "RemoveContainer" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"
Mar 12 12:21:00.073897 master-0 kubenswrapper[4102]: I0312 12:21:00.073790 4102 scope.go:117] "RemoveContainer" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"
Mar 12 12:21:00.078605 master-0 kubenswrapper[4102]: I0312 12:21:00.077326 4102 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsbxb"]
Mar 12 12:21:00.084262 master-0 kubenswrapper[4102]: I0312 12:21:00.083346 4102 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xsbxb"]
Mar 12 12:21:00.088705 master-0 kubenswrapper[4102]: I0312 12:21:00.088584 4102 scope.go:117] "RemoveContainer" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"
Mar 12 12:21:00.098213 master-0 kubenswrapper[4102]: I0312 12:21:00.098092 4102 scope.go:117] "RemoveContainer" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"
Mar 12 12:21:00.106592 master-0 kubenswrapper[4102]: I0312 12:21:00.106522 4102 scope.go:117] "RemoveContainer" containerID="2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"
Mar 12 12:21:00.119562 master-0 kubenswrapper[4102]: I0312 12:21:00.118701 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:00.119895 master-0 kubenswrapper[4102]: I0312 12:21:00.119789 4102 scope.go:117] "RemoveContainer" containerID="317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"
Mar 12 12:21:00.132307 master-0 kubenswrapper[4102]: I0312 12:21:00.132249 4102 scope.go:117] "RemoveContainer" containerID="07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"
Mar 12 12:21:00.136744 master-0 kubenswrapper[4102]: W0312 12:21:00.136699 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf04121eb_5c7b_42cd_a2e2_26cf1c67593d.slice/crio-18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a WatchSource:0}: Error finding container 18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a: Status 404 returned error can't find the container with id 18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a
Mar 12 12:21:00.140918 master-0 kubenswrapper[4102]: I0312 12:21:00.140857 4102 scope.go:117] "RemoveContainer" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"
Mar 12 12:21:00.141689 master-0 kubenswrapper[4102]: E0312 12:21:00.141626 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": container with ID starting with 9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04 not found: ID does not exist" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"
Mar 12 12:21:00.141689 master-0 kubenswrapper[4102]: I0312 12:21:00.141684 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"} err="failed to get container status \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": rpc error: code = NotFound desc = could not find container \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": container with ID starting with 9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04 not found: ID does not exist"
Mar 12 12:21:00.141689 master-0 kubenswrapper[4102]: I0312 12:21:00.141722 4102 scope.go:117] "RemoveContainer" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"
Mar 12 12:21:00.142093 master-0 kubenswrapper[4102]: E0312 12:21:00.142041 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": container with ID starting with abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32 not found: ID does not exist" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"
Mar 12 12:21:00.142093 master-0 kubenswrapper[4102]: I0312 12:21:00.142079 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"} err="failed to get container status \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": rpc error: code = NotFound desc = could not find container \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": container with ID starting with abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32 not found: ID does not exist"
Mar 12 12:21:00.142223 master-0 kubenswrapper[4102]: I0312 12:21:00.142100 4102 scope.go:117] "RemoveContainer" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"
Mar 12 12:21:00.142389 master-0 kubenswrapper[4102]: E0312 12:21:00.142358 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": container with ID starting with 3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c not found: ID does not exist" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"
Mar 12 12:21:00.142472 master-0 kubenswrapper[4102]: I0312 12:21:00.142387 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"} err="failed to get container status \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": rpc error: code = NotFound desc = could not find container \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": container with ID starting with 3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c not found: ID does not exist"
Mar 12 12:21:00.142472 master-0 kubenswrapper[4102]: I0312 12:21:00.142407 4102 scope.go:117] "RemoveContainer" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"
Mar 12 12:21:00.142966 master-0 kubenswrapper[4102]: E0312 12:21:00.142908 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": container with ID starting with c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07 not found: ID does not exist" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"
Mar 12 12:21:00.142966 master-0 kubenswrapper[4102]: I0312 12:21:00.142954 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"} err="failed to get container status \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": rpc error: code = NotFound desc = could not find container \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": container with ID starting with c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07 not found: ID does not exist"
Mar 12 12:21:00.143144 master-0 kubenswrapper[4102]: I0312 12:21:00.142978 4102 scope.go:117] "RemoveContainer" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"
Mar 12 12:21:00.146516 master-0 kubenswrapper[4102]: E0312 12:21:00.145106 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": container with ID starting with 84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30 not found: ID does not exist" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"
Mar 12 12:21:00.146516 master-0 kubenswrapper[4102]: I0312 12:21:00.145803 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"} err="failed to get container status \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": rpc error: code = NotFound desc = could not find container \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": container with ID starting with 84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30 not found: ID does not exist"
Mar 12 12:21:00.146516 master-0 kubenswrapper[4102]: I0312 12:21:00.145836 4102 scope.go:117] "RemoveContainer" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"
Mar 12 12:21:00.146516 master-0 kubenswrapper[4102]: E0312 12:21:00.146246 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": container with ID starting with
f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049 not found: ID does not exist" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049" Mar 12 12:21:00.146516 master-0 kubenswrapper[4102]: I0312 12:21:00.146270 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"} err="failed to get container status \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": rpc error: code = NotFound desc = could not find container \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": container with ID starting with f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049 not found: ID does not exist" Mar 12 12:21:00.146516 master-0 kubenswrapper[4102]: I0312 12:21:00.146288 4102 scope.go:117] "RemoveContainer" containerID="2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff" Mar 12 12:21:00.146917 master-0 kubenswrapper[4102]: E0312 12:21:00.146883 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": container with ID starting with 2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff not found: ID does not exist" containerID="2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff" Mar 12 12:21:00.146952 master-0 kubenswrapper[4102]: I0312 12:21:00.146912 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"} err="failed to get container status \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": rpc error: code = NotFound desc = could not find container \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": container with ID starting with 
2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff not found: ID does not exist" Mar 12 12:21:00.146952 master-0 kubenswrapper[4102]: I0312 12:21:00.146931 4102 scope.go:117] "RemoveContainer" containerID="317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28" Mar 12 12:21:00.147278 master-0 kubenswrapper[4102]: E0312 12:21:00.147235 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": container with ID starting with 317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28 not found: ID does not exist" containerID="317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28" Mar 12 12:21:00.147313 master-0 kubenswrapper[4102]: I0312 12:21:00.147272 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"} err="failed to get container status \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": rpc error: code = NotFound desc = could not find container \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": container with ID starting with 317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28 not found: ID does not exist" Mar 12 12:21:00.147313 master-0 kubenswrapper[4102]: I0312 12:21:00.147303 4102 scope.go:117] "RemoveContainer" containerID="07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a" Mar 12 12:21:00.147614 master-0 kubenswrapper[4102]: E0312 12:21:00.147590 4102 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": container with ID starting with 07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a not found: ID does not exist" 
containerID="07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a" Mar 12 12:21:00.147670 master-0 kubenswrapper[4102]: I0312 12:21:00.147615 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"} err="failed to get container status \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": rpc error: code = NotFound desc = could not find container \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": container with ID starting with 07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a not found: ID does not exist" Mar 12 12:21:00.147670 master-0 kubenswrapper[4102]: I0312 12:21:00.147630 4102 scope.go:117] "RemoveContainer" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04" Mar 12 12:21:00.147908 master-0 kubenswrapper[4102]: I0312 12:21:00.147879 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"} err="failed to get container status \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": rpc error: code = NotFound desc = could not find container \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": container with ID starting with 9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04 not found: ID does not exist" Mar 12 12:21:00.147908 master-0 kubenswrapper[4102]: I0312 12:21:00.147902 4102 scope.go:117] "RemoveContainer" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" Mar 12 12:21:00.150079 master-0 kubenswrapper[4102]: I0312 12:21:00.150028 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"} err="failed to get container status 
\"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": rpc error: code = NotFound desc = could not find container \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": container with ID starting with abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32 not found: ID does not exist" Mar 12 12:21:00.150079 master-0 kubenswrapper[4102]: I0312 12:21:00.150068 4102 scope.go:117] "RemoveContainer" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" Mar 12 12:21:00.150612 master-0 kubenswrapper[4102]: I0312 12:21:00.150446 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"} err="failed to get container status \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": rpc error: code = NotFound desc = could not find container \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": container with ID starting with 3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c not found: ID does not exist" Mar 12 12:21:00.150612 master-0 kubenswrapper[4102]: I0312 12:21:00.150528 4102 scope.go:117] "RemoveContainer" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07" Mar 12 12:21:00.150968 master-0 kubenswrapper[4102]: I0312 12:21:00.150929 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"} err="failed to get container status \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": rpc error: code = NotFound desc = could not find container \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": container with ID starting with c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07 not found: ID does not exist" Mar 12 12:21:00.150968 master-0 kubenswrapper[4102]: I0312 12:21:00.150956 4102 
scope.go:117] "RemoveContainer" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30" Mar 12 12:21:00.151234 master-0 kubenswrapper[4102]: I0312 12:21:00.151208 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"} err="failed to get container status \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": rpc error: code = NotFound desc = could not find container \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": container with ID starting with 84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30 not found: ID does not exist" Mar 12 12:21:00.151234 master-0 kubenswrapper[4102]: I0312 12:21:00.151229 4102 scope.go:117] "RemoveContainer" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049" Mar 12 12:21:00.151496 master-0 kubenswrapper[4102]: I0312 12:21:00.151454 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"} err="failed to get container status \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": rpc error: code = NotFound desc = could not find container \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": container with ID starting with f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049 not found: ID does not exist" Mar 12 12:21:00.151543 master-0 kubenswrapper[4102]: I0312 12:21:00.151504 4102 scope.go:117] "RemoveContainer" containerID="2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff" Mar 12 12:21:00.151859 master-0 kubenswrapper[4102]: I0312 12:21:00.151814 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"} err="failed to get container status 
\"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": rpc error: code = NotFound desc = could not find container \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": container with ID starting with 2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff not found: ID does not exist" Mar 12 12:21:00.151916 master-0 kubenswrapper[4102]: I0312 12:21:00.151856 4102 scope.go:117] "RemoveContainer" containerID="317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28" Mar 12 12:21:00.152171 master-0 kubenswrapper[4102]: I0312 12:21:00.152146 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"} err="failed to get container status \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": rpc error: code = NotFound desc = could not find container \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": container with ID starting with 317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28 not found: ID does not exist" Mar 12 12:21:00.152171 master-0 kubenswrapper[4102]: I0312 12:21:00.152168 4102 scope.go:117] "RemoveContainer" containerID="07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a" Mar 12 12:21:00.152461 master-0 kubenswrapper[4102]: I0312 12:21:00.152435 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"} err="failed to get container status \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": rpc error: code = NotFound desc = could not find container \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": container with ID starting with 07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a not found: ID does not exist" Mar 12 12:21:00.152461 master-0 kubenswrapper[4102]: I0312 12:21:00.152457 4102 
scope.go:117] "RemoveContainer" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04" Mar 12 12:21:00.153343 master-0 kubenswrapper[4102]: I0312 12:21:00.153309 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"} err="failed to get container status \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": rpc error: code = NotFound desc = could not find container \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": container with ID starting with 9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04 not found: ID does not exist" Mar 12 12:21:00.153343 master-0 kubenswrapper[4102]: I0312 12:21:00.153331 4102 scope.go:117] "RemoveContainer" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" Mar 12 12:21:00.153723 master-0 kubenswrapper[4102]: I0312 12:21:00.153671 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"} err="failed to get container status \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": rpc error: code = NotFound desc = could not find container \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": container with ID starting with abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32 not found: ID does not exist" Mar 12 12:21:00.153723 master-0 kubenswrapper[4102]: I0312 12:21:00.153696 4102 scope.go:117] "RemoveContainer" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" Mar 12 12:21:00.154089 master-0 kubenswrapper[4102]: I0312 12:21:00.154050 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"} err="failed to get container status 
\"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": rpc error: code = NotFound desc = could not find container \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": container with ID starting with 3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c not found: ID does not exist" Mar 12 12:21:00.154089 master-0 kubenswrapper[4102]: I0312 12:21:00.154086 4102 scope.go:117] "RemoveContainer" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07" Mar 12 12:21:00.154724 master-0 kubenswrapper[4102]: I0312 12:21:00.154697 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"} err="failed to get container status \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": rpc error: code = NotFound desc = could not find container \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": container with ID starting with c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07 not found: ID does not exist" Mar 12 12:21:00.154781 master-0 kubenswrapper[4102]: I0312 12:21:00.154727 4102 scope.go:117] "RemoveContainer" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30" Mar 12 12:21:00.155067 master-0 kubenswrapper[4102]: I0312 12:21:00.155041 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"} err="failed to get container status \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": rpc error: code = NotFound desc = could not find container \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": container with ID starting with 84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30 not found: ID does not exist" Mar 12 12:21:00.155067 master-0 kubenswrapper[4102]: I0312 12:21:00.155064 4102 
scope.go:117] "RemoveContainer" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049" Mar 12 12:21:00.155438 master-0 kubenswrapper[4102]: I0312 12:21:00.155397 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"} err="failed to get container status \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": rpc error: code = NotFound desc = could not find container \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": container with ID starting with f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049 not found: ID does not exist" Mar 12 12:21:00.155438 master-0 kubenswrapper[4102]: I0312 12:21:00.155435 4102 scope.go:117] "RemoveContainer" containerID="2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff" Mar 12 12:21:00.155957 master-0 kubenswrapper[4102]: I0312 12:21:00.155924 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"} err="failed to get container status \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": rpc error: code = NotFound desc = could not find container \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": container with ID starting with 2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff not found: ID does not exist" Mar 12 12:21:00.155995 master-0 kubenswrapper[4102]: I0312 12:21:00.155962 4102 scope.go:117] "RemoveContainer" containerID="317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28" Mar 12 12:21:00.156274 master-0 kubenswrapper[4102]: I0312 12:21:00.156249 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"} err="failed to get container status 
\"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": rpc error: code = NotFound desc = could not find container \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": container with ID starting with 317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28 not found: ID does not exist" Mar 12 12:21:00.156274 master-0 kubenswrapper[4102]: I0312 12:21:00.156270 4102 scope.go:117] "RemoveContainer" containerID="07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a" Mar 12 12:21:00.156771 master-0 kubenswrapper[4102]: I0312 12:21:00.156741 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"} err="failed to get container status \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": rpc error: code = NotFound desc = could not find container \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": container with ID starting with 07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a not found: ID does not exist" Mar 12 12:21:00.156815 master-0 kubenswrapper[4102]: I0312 12:21:00.156776 4102 scope.go:117] "RemoveContainer" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04" Mar 12 12:21:00.157061 master-0 kubenswrapper[4102]: I0312 12:21:00.157030 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"} err="failed to get container status \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": rpc error: code = NotFound desc = could not find container \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": container with ID starting with 9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04 not found: ID does not exist" Mar 12 12:21:00.157105 master-0 kubenswrapper[4102]: I0312 12:21:00.157064 4102 
scope.go:117] "RemoveContainer" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32" Mar 12 12:21:00.157510 master-0 kubenswrapper[4102]: I0312 12:21:00.157455 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"} err="failed to get container status \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": rpc error: code = NotFound desc = could not find container \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": container with ID starting with abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32 not found: ID does not exist" Mar 12 12:21:00.157562 master-0 kubenswrapper[4102]: I0312 12:21:00.157512 4102 scope.go:117] "RemoveContainer" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c" Mar 12 12:21:00.157818 master-0 kubenswrapper[4102]: I0312 12:21:00.157785 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"} err="failed to get container status \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": rpc error: code = NotFound desc = could not find container \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": container with ID starting with 3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c not found: ID does not exist" Mar 12 12:21:00.157862 master-0 kubenswrapper[4102]: I0312 12:21:00.157820 4102 scope.go:117] "RemoveContainer" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07" Mar 12 12:21:00.158230 master-0 kubenswrapper[4102]: I0312 12:21:00.158205 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"} err="failed to get container status 
\"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": rpc error: code = NotFound desc = could not find container \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": container with ID starting with c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07 not found: ID does not exist" Mar 12 12:21:00.158273 master-0 kubenswrapper[4102]: I0312 12:21:00.158228 4102 scope.go:117] "RemoveContainer" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30" Mar 12 12:21:00.158659 master-0 kubenswrapper[4102]: I0312 12:21:00.158626 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"} err="failed to get container status \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": rpc error: code = NotFound desc = could not find container \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": container with ID starting with 84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30 not found: ID does not exist" Mar 12 12:21:00.158716 master-0 kubenswrapper[4102]: I0312 12:21:00.158663 4102 scope.go:117] "RemoveContainer" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049" Mar 12 12:21:00.159026 master-0 kubenswrapper[4102]: I0312 12:21:00.158995 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"} err="failed to get container status \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": rpc error: code = NotFound desc = could not find container \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": container with ID starting with f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049 not found: ID does not exist" Mar 12 12:21:00.159077 master-0 kubenswrapper[4102]: I0312 12:21:00.159029 4102 
scope.go:117] "RemoveContainer" containerID="2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff" Mar 12 12:21:00.159670 master-0 kubenswrapper[4102]: I0312 12:21:00.159638 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff"} err="failed to get container status \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": rpc error: code = NotFound desc = could not find container \"2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff\": container with ID starting with 2918991ec885ac1c05d85ed64faf6c43019b54b362231b1352649f2a3b3ac8ff not found: ID does not exist" Mar 12 12:21:00.159670 master-0 kubenswrapper[4102]: I0312 12:21:00.159665 4102 scope.go:117] "RemoveContainer" containerID="317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28" Mar 12 12:21:00.160055 master-0 kubenswrapper[4102]: I0312 12:21:00.159998 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28"} err="failed to get container status \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": rpc error: code = NotFound desc = could not find container \"317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28\": container with ID starting with 317bf660c88a2d91466e1adc40f8d269f10bee24348a669ef04079fee57cda28 not found: ID does not exist" Mar 12 12:21:00.160055 master-0 kubenswrapper[4102]: I0312 12:21:00.160040 4102 scope.go:117] "RemoveContainer" containerID="07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a" Mar 12 12:21:00.160325 master-0 kubenswrapper[4102]: I0312 12:21:00.160295 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a"} err="failed to get container status 
\"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": rpc error: code = NotFound desc = could not find container \"07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a\": container with ID starting with 07a3cb8d8c8dca997819e9ba174d87dec5aa6567a6c9a863707a734c7828d15a not found: ID does not exist"
Mar 12 12:21:00.160325 master-0 kubenswrapper[4102]: I0312 12:21:00.160321 4102 scope.go:117] "RemoveContainer" containerID="9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"
Mar 12 12:21:00.160841 master-0 kubenswrapper[4102]: I0312 12:21:00.160813 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04"} err="failed to get container status \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": rpc error: code = NotFound desc = could not find container \"9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04\": container with ID starting with 9e0155a432127380970087fd1e89cb17b9464eb75c218c45132a36fdd0115a04 not found: ID does not exist"
Mar 12 12:21:00.160841 master-0 kubenswrapper[4102]: I0312 12:21:00.160839 4102 scope.go:117] "RemoveContainer" containerID="abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"
Mar 12 12:21:00.161168 master-0 kubenswrapper[4102]: I0312 12:21:00.161146 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32"} err="failed to get container status \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": rpc error: code = NotFound desc = could not find container \"abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32\": container with ID starting with abb5fca05a13b70d9bda9a5b934bfbadedcb66bdcf9ead2443fd4e25d2deee32 not found: ID does not exist"
Mar 12 12:21:00.161168 master-0 kubenswrapper[4102]: I0312 12:21:00.161165 4102 scope.go:117] "RemoveContainer" containerID="3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"
Mar 12 12:21:00.161521 master-0 kubenswrapper[4102]: I0312 12:21:00.161453 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c"} err="failed to get container status \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": rpc error: code = NotFound desc = could not find container \"3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c\": container with ID starting with 3ace8e25bb154f5eba24594a09f5325b70b8b25e5face1ed80c18e6dc979971c not found: ID does not exist"
Mar 12 12:21:00.161578 master-0 kubenswrapper[4102]: I0312 12:21:00.161525 4102 scope.go:117] "RemoveContainer" containerID="c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"
Mar 12 12:21:00.161851 master-0 kubenswrapper[4102]: I0312 12:21:00.161826 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07"} err="failed to get container status \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": rpc error: code = NotFound desc = could not find container \"c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07\": container with ID starting with c5924dc1153d10dfcbbd03c8bb24fbb5cd5b8b259ec7fb56fa13ccc77ea77a07 not found: ID does not exist"
Mar 12 12:21:00.161851 master-0 kubenswrapper[4102]: I0312 12:21:00.161848 4102 scope.go:117] "RemoveContainer" containerID="84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"
Mar 12 12:21:00.162616 master-0 kubenswrapper[4102]: I0312 12:21:00.162358 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30"} err="failed to get container status \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": rpc error: code = NotFound desc = could not find container \"84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30\": container with ID starting with 84078376270f3a740ffbf9cbd764aa26a68fdcc754a61897fa64c7db3ec3bd30 not found: ID does not exist"
Mar 12 12:21:00.162664 master-0 kubenswrapper[4102]: I0312 12:21:00.162623 4102 scope.go:117] "RemoveContainer" containerID="f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"
Mar 12 12:21:00.162953 master-0 kubenswrapper[4102]: I0312 12:21:00.162921 4102 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049"} err="failed to get container status \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": rpc error: code = NotFound desc = could not find container \"f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049\": container with ID starting with f75a6fa073617ec1b05c1676c2817071d74dbc4795426b4af49b7ea4258b8049 not found: ID does not exist"
Mar 12 12:21:01.039677 master-0 kubenswrapper[4102]: I0312 12:21:01.039251 4102 generic.go:334] "Generic (PLEG): container finished" podID="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" containerID="0f612f745cd190b544d29d6db2a18fd47dfd753a5c71d5de2a3d0a726aefe224" exitCode=0
Mar 12 12:21:01.039677 master-0 kubenswrapper[4102]: I0312 12:21:01.039402 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerDied","Data":"0f612f745cd190b544d29d6db2a18fd47dfd753a5c71d5de2a3d0a726aefe224"}
Mar 12 12:21:01.040461 master-0 kubenswrapper[4102]: I0312 12:21:01.039728 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a"}
Mar 12 12:21:01.839186 master-0 kubenswrapper[4102]: I0312 12:21:01.839075 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:01.839315 master-0 kubenswrapper[4102]: I0312 12:21:01.839214 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:01.839392 master-0 kubenswrapper[4102]: E0312 12:21:01.839335 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1"
Mar 12 12:21:01.839507 master-0 kubenswrapper[4102]: E0312 12:21:01.839429 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:21:01.845367 master-0 kubenswrapper[4102]: I0312 12:21:01.845307 4102 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7b07b83-cade-4b52-b568-5c7e9d7c496b" path="/var/lib/kubelet/pods/b7b07b83-cade-4b52-b568-5c7e9d7c496b/volumes"
Mar 12 12:21:02.051620 master-0 kubenswrapper[4102]: I0312 12:21:02.051464 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"7cc8dbf5c320b6b540f75c8f5f4d9cb877013eb5fceba97b6b8cbd9e8d957f58"}
Mar 12 12:21:02.051620 master-0 kubenswrapper[4102]: I0312 12:21:02.051578 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"00926042c3fd8a2c2320f20068a3583e1f5f0bd6f47f82ed8d5996083578ca4a"}
Mar 12 12:21:02.051620 master-0 kubenswrapper[4102]: I0312 12:21:02.051621 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"60321f8c87557f6db2176b31ff576442419a4fb01d0ad6707c7c7cc65d563d35"}
Mar 12 12:21:02.052582 master-0 kubenswrapper[4102]: I0312 12:21:02.051645 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"080b89f0b094a87ba5c73937ed4168eecc0a1552a8e245b85823463caa208d30"}
Mar 12 12:21:02.052582 master-0 kubenswrapper[4102]: I0312 12:21:02.051667 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"263f87705b784d25ea597b41b2078c443059878b74626d36538d29cea8ea6fbf"}
Mar 12 12:21:02.052582 master-0 kubenswrapper[4102]: I0312 12:21:02.051688 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"96af38acc88d6db905fe227ed9a66b6b8372a51f5dde92276e20ebb1eb8def02"}
Mar 12 12:21:02.690862 master-0 kubenswrapper[4102]: I0312 12:21:02.690766 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:02.691123 master-0 kubenswrapper[4102]: E0312 12:21:02.690952 4102 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:02.691123 master-0 kubenswrapper[4102]: E0312 12:21:02.691025 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:06.690999171 +0000 UTC m=+171.623775626 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:03.839221 master-0 kubenswrapper[4102]: I0312 12:21:03.839142 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:03.839829 master-0 kubenswrapper[4102]: E0312 12:21:03.839282 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:21:03.839829 master-0 kubenswrapper[4102]: I0312 12:21:03.839062 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:03.839829 master-0 kubenswrapper[4102]: E0312 12:21:03.839525 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1"
Mar 12 12:21:04.065866 master-0 kubenswrapper[4102]: I0312 12:21:04.065809 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"f22dc588d2e654fa26957124f480421892925428ad09a329a719a6e359930a9c"}
Mar 12 12:21:05.543514 master-0 kubenswrapper[4102]: I0312 12:21:05.543170 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:05.543514 master-0 kubenswrapper[4102]: E0312 12:21:05.543374 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 12 12:21:05.543514 master-0 kubenswrapper[4102]: E0312 12:21:05.543390 4102 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 12 12:21:05.543514 master-0 kubenswrapper[4102]: E0312 12:21:05.543401 4102 projected.go:194] Error preparing data for projected volume kube-api-access-jxt29 for pod openshift-network-diagnostics/network-check-target-dfz7x: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 12:21:05.543514 master-0 kubenswrapper[4102]: E0312 12:21:05.543447 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29 podName:269d77d9-815e-4324-8827-1ce429063ed1 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:37.543434278 +0000 UTC m=+142.476210693 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxt29" (UniqueName: "kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29") pod "network-check-target-dfz7x" (UID: "269d77d9-815e-4324-8827-1ce429063ed1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 12 12:21:05.838746 master-0 kubenswrapper[4102]: I0312 12:21:05.838619 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:05.839517 master-0 kubenswrapper[4102]: E0312 12:21:05.839439 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1"
Mar 12 12:21:05.839517 master-0 kubenswrapper[4102]: I0312 12:21:05.839505 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:05.839711 master-0 kubenswrapper[4102]: E0312 12:21:05.839605 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:21:07.087349 master-0 kubenswrapper[4102]: I0312 12:21:07.086638 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" event={"ID":"f04121eb-5c7b-42cd-a2e2-26cf1c67593d","Type":"ContainerStarted","Data":"97fe845539e4654de20fc5c501ba7d00cac1a717c2e953e9f9cdbbaae230dcd2"}
Mar 12 12:21:07.087349 master-0 kubenswrapper[4102]: I0312 12:21:07.087048 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:07.087349 master-0 kubenswrapper[4102]: I0312 12:21:07.087073 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:07.119516 master-0 kubenswrapper[4102]: I0312 12:21:07.119442 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:07.126619 master-0 kubenswrapper[4102]: I0312 12:21:07.126271 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" podStartSLOduration=8.126246862 podStartE2EDuration="8.126246862s" podCreationTimestamp="2026-03-12 12:20:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:21:07.125783501 +0000 UTC m=+112.058559966" watchObservedRunningTime="2026-03-12 12:21:07.126246862 +0000 UTC m=+112.059023277"
Mar 12 12:21:07.631670 master-0 kubenswrapper[4102]: I0312 12:21:07.631599 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dfz7x"]
Mar 12 12:21:07.631863 master-0 kubenswrapper[4102]: I0312 12:21:07.631761 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:07.631933 master-0 kubenswrapper[4102]: E0312 12:21:07.631904 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1"
Mar 12 12:21:07.639908 master-0 kubenswrapper[4102]: I0312 12:21:07.639626 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4m9jh"]
Mar 12 12:21:07.640516 master-0 kubenswrapper[4102]: I0312 12:21:07.640198 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:07.643811 master-0 kubenswrapper[4102]: E0312 12:21:07.642047 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:21:08.091694 master-0 kubenswrapper[4102]: I0312 12:21:08.091584 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:08.120448 master-0 kubenswrapper[4102]: I0312 12:21:08.120388 4102 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:08.839272 master-0 kubenswrapper[4102]: I0312 12:21:08.838953 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:08.839549 master-0 kubenswrapper[4102]: E0312 12:21:08.839323 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1"
Mar 12 12:21:09.838792 master-0 kubenswrapper[4102]: I0312 12:21:09.838721 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:09.839360 master-0 kubenswrapper[4102]: E0312 12:21:09.838916 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:21:10.839857 master-0 kubenswrapper[4102]: I0312 12:21:10.839767 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:10.840422 master-0 kubenswrapper[4102]: E0312 12:21:10.840007 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-dfz7x" podUID="269d77d9-815e-4324-8827-1ce429063ed1"
Mar 12 12:21:10.853446 master-0 kubenswrapper[4102]: I0312 12:21:10.853365 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 12 12:21:11.839606 master-0 kubenswrapper[4102]: I0312 12:21:11.839369 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:11.839863 master-0 kubenswrapper[4102]: E0312 12:21:11.839651 4102 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4m9jh" podUID="e64bc838-280e-4231-9732-1adb69fed0bc"
Mar 12 12:21:11.850784 master-0 kubenswrapper[4102]: I0312 12:21:11.850714 4102 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady"
Mar 12 12:21:11.851709 master-0 kubenswrapper[4102]: I0312 12:21:11.850919 4102 kubelet_node_status.go:538] "Fast updating node status as it just became ready"
Mar 12 12:21:11.899924 master-0 kubenswrapper[4102]: I0312 12:21:11.896817 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"]
Mar 12 12:21:11.901444 master-0 kubenswrapper[4102]: I0312 12:21:11.901053 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-vpss8"]
Mar 12 12:21:11.901444 master-0 kubenswrapper[4102]: I0312 12:21:11.901328 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:11.909613 master-0 kubenswrapper[4102]: I0312 12:21:11.908815 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 12 12:21:11.909901 master-0 kubenswrapper[4102]: I0312 12:21:11.909666 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 12 12:21:11.909901 master-0 kubenswrapper[4102]: I0312 12:21:11.909836 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 12 12:21:11.913854 master-0 kubenswrapper[4102]: I0312 12:21:11.913798 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"]
Mar 12 12:21:11.914158 master-0 kubenswrapper[4102]: I0312 12:21:11.914116 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"]
Mar 12 12:21:11.914427 master-0 kubenswrapper[4102]: I0312 12:21:11.914387 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"]
Mar 12 12:21:11.914747 master-0 kubenswrapper[4102]: I0312 12:21:11.914705 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"]
Mar 12 12:21:11.915179 master-0 kubenswrapper[4102]: I0312 12:21:11.915110 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 12 12:21:11.915179 master-0 kubenswrapper[4102]: I0312 12:21:11.915179 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k"]
Mar 12 12:21:11.915687 master-0 kubenswrapper[4102]: I0312 12:21:11.915644 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:11.915875 master-0 kubenswrapper[4102]: I0312 12:21:11.915823 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"]
Mar 12 12:21:11.916180 master-0 kubenswrapper[4102]: I0312 12:21:11.916142 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:11.916298 master-0 kubenswrapper[4102]: I0312 12:21:11.916275 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:11.925546 master-0 kubenswrapper[4102]: I0312 12:21:11.916817 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:11.925546 master-0 kubenswrapper[4102]: I0312 12:21:11.917403 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:11.925546 master-0 kubenswrapper[4102]: I0312 12:21:11.917799 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:11.925546 master-0 kubenswrapper[4102]: I0312 12:21:11.918174 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k"
Mar 12 12:21:11.929532 master-0 kubenswrapper[4102]: I0312 12:21:11.929458 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 12 12:21:11.929712 master-0 kubenswrapper[4102]: I0312 12:21:11.929597 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.929712 master-0 kubenswrapper[4102]: I0312 12:21:11.929456 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.929957 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"]
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.930652 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"]
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.930856 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.931517 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.931657 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.931841 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932057 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932335 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932347 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932428 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932551 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932779 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932887 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.932921 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.934631 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:21:11.938545 master-0 kubenswrapper[4102]: I0312 12:21:11.936722 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 12 12:21:11.941195 master-0 kubenswrapper[4102]: I0312 12:21:11.941138 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 12 12:21:11.941654 master-0 kubenswrapper[4102]: I0312 12:21:11.941612 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-l8x6p"]
Mar 12 12:21:11.942328 master-0 kubenswrapper[4102]: I0312 12:21:11.942296 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.942928 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.943138 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.943380 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.943934 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.944290 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.944375 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.944553 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.944690 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.944865 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.944916 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-xpzn2"]
Mar 12 12:21:11.946521 master-0 kubenswrapper[4102]: I0312 12:21:11.945762 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:21:11.947091 master-0 kubenswrapper[4102]: I0312 12:21:11.946668 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.947091 master-0 kubenswrapper[4102]: I0312 12:21:11.946947 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 12 12:21:11.947091 master-0 kubenswrapper[4102]: I0312 12:21:11.947079 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"]
Mar 12 12:21:11.965297 master-0 kubenswrapper[4102]: I0312 12:21:11.964960 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:11.966909 master-0 kubenswrapper[4102]: I0312 12:21:11.947052 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 12 12:21:11.966909 master-0 kubenswrapper[4102]: I0312 12:21:11.966447 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 12 12:21:11.983335 master-0 kubenswrapper[4102]: I0312 12:21:11.983254 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 12 12:21:11.983763 master-0 kubenswrapper[4102]: I0312 12:21:11.983733 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.983905 master-0 kubenswrapper[4102]: I0312 12:21:11.983881 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.984231 master-0 kubenswrapper[4102]: I0312 12:21:11.984206 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 12 12:21:11.984295 master-0 kubenswrapper[4102]: I0312 12:21:11.984283 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 12 12:21:11.984381 master-0 kubenswrapper[4102]: I0312 12:21:11.984199 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.984524 master-0 kubenswrapper[4102]: I0312 12:21:11.984499 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"]
Mar 12 12:21:11.984642 master-0 kubenswrapper[4102]: I0312 12:21:11.984609 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 12 12:21:11.984826 master-0 kubenswrapper[4102]: I0312 12:21:11.984739 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=1.9847191039999998 podStartE2EDuration="1.984719104s" podCreationTimestamp="2026-03-12 12:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:21:11.984123449 +0000 UTC m=+116.916899864" watchObservedRunningTime="2026-03-12 12:21:11.984719104 +0000 UTC m=+116.917495509"
Mar 12 12:21:11.984826 master-0 kubenswrapper[4102]: I0312 12:21:11.984820 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"]
Mar 12 12:21:11.985123 master-0 kubenswrapper[4102]: I0312 12:21:11.985095 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"
Mar 12 12:21:11.985225 master-0 kubenswrapper[4102]: I0312 12:21:11.984348 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 12 12:21:11.985356 master-0 kubenswrapper[4102]: I0312 12:21:11.985326 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"]
Mar 12 12:21:11.985356 master-0 kubenswrapper[4102]: I0312 12:21:11.985351 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:11.985693 master-0 kubenswrapper[4102]: I0312 12:21:11.984410 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.985830 master-0 kubenswrapper[4102]: I0312 12:21:11.985805 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:11.988021 master-0 kubenswrapper[4102]: I0312 12:21:11.987968 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.988138 master-0 kubenswrapper[4102]: I0312 12:21:11.988064 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"]
Mar 12 12:21:11.988282 master-0 kubenswrapper[4102]: I0312 12:21:11.988247 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 12 12:21:11.988380 master-0 kubenswrapper[4102]: I0312 12:21:11.988349 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"]
Mar 12 12:21:11.988380 master-0 kubenswrapper[4102]: I0312 12:21:11.988373 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 12 12:21:11.989005 master-0 kubenswrapper[4102]: I0312 12:21:11.988502 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 12 12:21:11.989005 master-0 kubenswrapper[4102]: I0312 12:21:11.988679 4102 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"] Mar 12 12:21:11.989005 master-0 kubenswrapper[4102]: I0312 12:21:11.989002 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:21:11.990382 master-0 kubenswrapper[4102]: I0312 12:21:11.990202 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:11.990382 master-0 kubenswrapper[4102]: I0312 12:21:11.990314 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:11.990591 master-0 kubenswrapper[4102]: I0312 12:21:11.990411 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 12 12:21:11.990591 master-0 kubenswrapper[4102]: I0312 12:21:11.990510 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 12 12:21:11.990679 master-0 kubenswrapper[4102]: I0312 12:21:11.990647 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 12 12:21:11.992572 master-0 kubenswrapper[4102]: I0312 12:21:11.992400 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"] Mar 12 12:21:11.992847 master-0 kubenswrapper[4102]: I0312 12:21:11.992779 4102 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:11.993574 master-0 kubenswrapper[4102]: I0312 12:21:11.993549 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"] Mar 12 12:21:11.994681 master-0 kubenswrapper[4102]: I0312 12:21:11.993831 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:11.994681 master-0 kubenswrapper[4102]: I0312 12:21:11.993979 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"] Mar 12 12:21:11.994681 master-0 kubenswrapper[4102]: I0312 12:21:11.994525 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:21:11.994841 master-0 kubenswrapper[4102]: I0312 12:21:11.994683 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"] Mar 12 12:21:11.996238 master-0 kubenswrapper[4102]: I0312 12:21:11.996199 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-vpss8"] Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:11.997805 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:11.999541 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.000610 4102 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.000921 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.001591 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"] Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.001651 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"] Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.002018 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.002098 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"] Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.002227 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.002300 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.002587 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.002719 4102 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.003343 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 12:21:12.005565 master-0 kubenswrapper[4102]: I0312 12:21:12.004638 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 12 12:21:12.009828 master-0 kubenswrapper[4102]: I0312 12:21:12.009799 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 12:21:12.012514 master-0 kubenswrapper[4102]: I0312 12:21:12.012286 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"] Mar 12 12:21:12.012978 master-0 kubenswrapper[4102]: I0312 12:21:12.012870 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"] Mar 12 12:21:12.013063 master-0 kubenswrapper[4102]: I0312 12:21:12.013022 4102 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-xqmw9"] Mar 12 12:21:12.015851 master-0 kubenswrapper[4102]: I0312 12:21:12.015828 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 12:21:12.016128 master-0 kubenswrapper[4102]: I0312 12:21:12.015881 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 12:21:12.019823 master-0 kubenswrapper[4102]: I0312 12:21:12.018951 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-xpzn2"] Mar 12 12:21:12.019823 master-0 kubenswrapper[4102]: I0312 12:21:12.019031 4102 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"] Mar 12 12:21:12.019823 master-0 kubenswrapper[4102]: I0312 12:21:12.019044 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"] Mar 12 12:21:12.019823 master-0 kubenswrapper[4102]: I0312 12:21:12.019056 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-l8x6p"] Mar 12 12:21:12.019823 master-0 kubenswrapper[4102]: I0312 12:21:12.019178 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"] Mar 12 12:21:12.019823 master-0 kubenswrapper[4102]: I0312 12:21:12.019199 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:12.020268 master-0 kubenswrapper[4102]: I0312 12:21:12.020248 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 12:21:12.020427 master-0 kubenswrapper[4102]: I0312 12:21:12.020386 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 12:21:12.020603 master-0 kubenswrapper[4102]: I0312 12:21:12.020276 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 12:21:12.020741 master-0 kubenswrapper[4102]: I0312 12:21:12.020711 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"] Mar 12 12:21:12.020741 master-0 kubenswrapper[4102]: I0312 12:21:12.020319 4102 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 12:21:12.020836 master-0 kubenswrapper[4102]: I0312 12:21:12.020823 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 12 12:21:12.020908 master-0 kubenswrapper[4102]: I0312 12:21:12.020644 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 12:21:12.021009 master-0 kubenswrapper[4102]: I0312 12:21:12.020982 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 12:21:12.021433 master-0 kubenswrapper[4102]: I0312 12:21:12.021398 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"] Mar 12 12:21:12.021862 master-0 kubenswrapper[4102]: I0312 12:21:12.021844 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 12:21:12.025522 master-0 kubenswrapper[4102]: I0312 12:21:12.023526 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k"] Mar 12 12:21:12.036876 master-0 kubenswrapper[4102]: I0312 12:21:12.036808 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"] Mar 12 12:21:12.037378 master-0 kubenswrapper[4102]: I0312 12:21:12.037288 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 12:21:12.037442 master-0 kubenswrapper[4102]: I0312 12:21:12.037421 4102 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 12:21:12.037773 master-0 kubenswrapper[4102]: I0312 12:21:12.037750 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 12:21:12.049695 master-0 kubenswrapper[4102]: I0312 12:21:12.049651 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 12:21:12.050274 master-0 kubenswrapper[4102]: I0312 12:21:12.050258 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 12 12:21:12.051597 master-0 kubenswrapper[4102]: I0312 12:21:12.051564 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 12:21:12.052586 master-0 kubenswrapper[4102]: I0312 12:21:12.052567 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"] Mar 12 12:21:12.052681 master-0 kubenswrapper[4102]: I0312 12:21:12.052670 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"] Mar 12 12:21:12.052755 master-0 kubenswrapper[4102]: I0312 12:21:12.052745 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"] Mar 12 12:21:12.055512 master-0 kubenswrapper[4102]: I0312 12:21:12.054522 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"] Mar 12 12:21:12.055512 master-0 kubenswrapper[4102]: I0312 12:21:12.055290 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"] Mar 12 12:21:12.059498 master-0 
kubenswrapper[4102]: I0312 12:21:12.056203 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"] Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.082948 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx2c4\" (UniqueName: \"kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083006 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083025 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083040 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwfct\" (UniqueName: \"kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " 
pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083057 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083072 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft5sd\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083088 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083102 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083115 4102 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083129 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083143 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jq9d5\" (UniqueName: \"kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083160 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083173 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token\") pod 
\"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083189 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmlzw\" (UniqueName: \"kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:12.083512 master-0 kubenswrapper[4102]: I0312 12:21:12.083205 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083220 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083233 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svpvs\" (UniqueName: \"kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.084220 master-0 
kubenswrapper[4102]: I0312 12:21:12.083261 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083276 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdvf6\" (UniqueName: \"kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083290 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083307 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083321 4102 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083337 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083356 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083371 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083385 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod 
\"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083399 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbztv\" (UniqueName: \"kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083415 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.084220 master-0 kubenswrapper[4102]: I0312 12:21:12.083430 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l57v\" (UniqueName: \"kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083452 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: 
\"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083469 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083500 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8mvz\" (UniqueName: \"kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083516 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083530 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 
12:21:12.083545 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6flvz\" (UniqueName: \"kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083569 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p97xk\" (UniqueName: \"kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083583 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.084788 master-0 kubenswrapper[4102]: I0312 12:21:12.083597 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:12.184861 master-0 kubenswrapper[4102]: I0312 12:21:12.184828 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:12.185025 master-0 kubenswrapper[4102]: I0312 12:21:12.185012 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:12.185103 master-0 kubenswrapper[4102]: I0312 12:21:12.185092 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:12.185185 master-0 kubenswrapper[4102]: I0312 12:21:12.185172 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.185258 master-0 kubenswrapper[4102]: I0312 12:21:12.185247 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token\") pod \"ingress-operator-677db989d6-vpss8\" (UID: 
\"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.185342 master-0 kubenswrapper[4102]: I0312 12:21:12.185331 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:21:12.185407 master-0 kubenswrapper[4102]: I0312 12:21:12.185396 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:12.185500 master-0 kubenswrapper[4102]: I0312 12:21:12.185461 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.185595 master-0 kubenswrapper[4102]: I0312 12:21:12.185580 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmlzw\" (UniqueName: \"kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:12.185679 master-0 kubenswrapper[4102]: 
I0312 12:21:12.185664 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.185762 master-0 kubenswrapper[4102]: E0312 12:21:12.185713 4102 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:12.185892 master-0 kubenswrapper[4102]: I0312 12:21:12.185861 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svpvs\" (UniqueName: \"kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.186024 master-0 kubenswrapper[4102]: E0312 12:21:12.185878 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.685852788 +0000 UTC m=+117.618629433 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found Mar 12 12:21:12.186173 master-0 kubenswrapper[4102]: I0312 12:21:12.186136 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dhfq\" (UniqueName: \"kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:12.186312 master-0 kubenswrapper[4102]: I0312 12:21:12.186276 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.186434 master-0 kubenswrapper[4102]: I0312 12:21:12.186416 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdvf6\" (UniqueName: \"kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:12.186607 master-0 kubenswrapper[4102]: I0312 12:21:12.186570 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle\") pod 
\"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:12.186726 master-0 kubenswrapper[4102]: I0312 12:21:12.186713 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:12.186858 master-0 kubenswrapper[4102]: I0312 12:21:12.186845 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:12.186964 master-0 kubenswrapper[4102]: I0312 12:21:12.186951 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.187126 master-0 kubenswrapper[4102]: I0312 12:21:12.187113 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:12.187221 
master-0 kubenswrapper[4102]: I0312 12:21:12.187210 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.187428 master-0 kubenswrapper[4102]: I0312 12:21:12.187400 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:21:12.187554 master-0 kubenswrapper[4102]: I0312 12:21:12.187044 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.187619 master-0 kubenswrapper[4102]: E0312 12:21:12.186424 4102 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:12.187668 master-0 kubenswrapper[4102]: E0312 12:21:12.187619 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.687604 +0000 UTC m=+117.620380425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:12.187668 master-0 kubenswrapper[4102]: I0312 12:21:12.187508 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.187768 master-0 kubenswrapper[4102]: I0312 12:21:12.187546 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bx48\" (UniqueName: \"kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48\") pod \"csi-snapshot-controller-operator-5685fbc7d-vmj4h\" (UID: \"5a012d0b-d1a8-4cd3-8b91-b346d0445f24\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" Mar 12 12:21:12.187866 master-0 kubenswrapper[4102]: I0312 12:21:12.187853 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:12.187959 master-0 kubenswrapper[4102]: I0312 12:21:12.187947 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod 
\"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:12.188032 master-0 kubenswrapper[4102]: E0312 12:21:12.188015 4102 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:12.188087 master-0 kubenswrapper[4102]: E0312 12:21:12.188054 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.68804303 +0000 UTC m=+117.620819445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found Mar 12 12:21:12.188167 master-0 kubenswrapper[4102]: I0312 12:21:12.188151 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.188279 master-0 kubenswrapper[4102]: I0312 12:21:12.188263 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.188396 master-0 kubenswrapper[4102]: I0312 12:21:12.188365 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:21:12.188511 master-0 kubenswrapper[4102]: I0312 12:21:12.188403 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.188574 master-0 kubenswrapper[4102]: I0312 12:21:12.188498 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.188574 master-0 kubenswrapper[4102]: I0312 12:21:12.188556 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.188658 master-0 kubenswrapper[4102]: I0312 12:21:12.188577 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5xlx\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: 
\"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:12.188658 master-0 kubenswrapper[4102]: I0312 12:21:12.188597 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbztv\" (UniqueName: \"kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.188658 master-0 kubenswrapper[4102]: I0312 12:21:12.188646 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l57v\" (UniqueName: \"kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.188759 master-0 kubenswrapper[4102]: I0312 12:21:12.188664 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.188759 master-0 kubenswrapper[4102]: I0312 12:21:12.188681 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:12.188759 master-0 kubenswrapper[4102]: I0312 
12:21:12.188715 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:12.188759 master-0 kubenswrapper[4102]: I0312 12:21:12.188743 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.188759 master-0 kubenswrapper[4102]: I0312 12:21:12.188763 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.188951 master-0 kubenswrapper[4102]: I0312 12:21:12.188779 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8mvz\" (UniqueName: \"kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.188951 master-0 kubenswrapper[4102]: I0312 12:21:12.188778 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.189667 master-0 kubenswrapper[4102]: I0312 12:21:12.189641 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.189795 master-0 kubenswrapper[4102]: I0312 12:21:12.189761 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:12.189864 master-0 kubenswrapper[4102]: I0312 12:21:12.189809 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.189864 master-0 kubenswrapper[4102]: I0312 12:21:12.189835 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:12.189864 master-0 kubenswrapper[4102]: I0312 12:21:12.189851 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.189981 master-0 kubenswrapper[4102]: I0312 12:21:12.189863 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6flvz\" (UniqueName: \"kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:12.190016 master-0 kubenswrapper[4102]: I0312 12:21:12.189986 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p97xk\" (UniqueName: \"kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.190068 master-0 kubenswrapper[4102]: I0312 12:21:12.190012 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqcrz\" (UniqueName: \"kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:12.190068 master-0 kubenswrapper[4102]: I0312 12:21:12.190036 4102 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.190068 master-0 kubenswrapper[4102]: I0312 12:21:12.190056 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:12.190173 master-0 kubenswrapper[4102]: I0312 12:21:12.190076 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2c4\" (UniqueName: \"kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.190173 master-0 kubenswrapper[4102]: I0312 12:21:12.190099 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.190173 master-0 kubenswrapper[4102]: I0312 12:21:12.190119 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:12.190173 master-0 kubenswrapper[4102]: I0312 12:21:12.190138 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:12.190173 master-0 kubenswrapper[4102]: I0312 12:21:12.190158 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190179 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190199 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190218 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lnbq\" (UniqueName: \"kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190237 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwfct\" (UniqueName: \"kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190259 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft5sd\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190264 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190276 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190296 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: I0312 12:21:12.190316 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:21:12.190344 master-0 kubenswrapper[4102]: E0312 12:21:12.190343 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: I0312 12:21:12.190341 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: E0312 
12:21:12.190386 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.690372386 +0000 UTC m=+117.623148801 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: E0312 12:21:12.190531 4102 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: E0312 12:21:12.190574 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.690552531 +0000 UTC m=+117.623328946 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: E0312 12:21:12.190527 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: E0312 12:21:12.190621 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.690613622 +0000 UTC m=+117.623390037 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: I0312 12:21:12.190648 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9gmt\" (UniqueName: \"kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:21:12.190665 master-0 kubenswrapper[4102]: E0312 12:21:12.190661 4102 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret 
"marketplace-operator-metrics" not found Mar 12 12:21:12.190896 master-0 kubenswrapper[4102]: I0312 12:21:12.190671 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:12.190896 master-0 kubenswrapper[4102]: E0312 12:21:12.190775 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.690766956 +0000 UTC m=+117.623543371 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found Mar 12 12:21:12.190896 master-0 kubenswrapper[4102]: I0312 12:21:12.190802 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:12.190896 master-0 kubenswrapper[4102]: I0312 12:21:12.190825 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token\") pod 
\"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:12.190896 master-0 kubenswrapper[4102]: I0312 12:21:12.190846 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.190896 master-0 kubenswrapper[4102]: I0312 12:21:12.190866 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4r7v\" (UniqueName: \"kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:12.191068 master-0 kubenswrapper[4102]: I0312 12:21:12.190903 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:12.191068 master-0 kubenswrapper[4102]: I0312 12:21:12.190923 4102 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:12.191068 master-0 kubenswrapper[4102]: 
I0312 12:21:12.190944 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.191068 master-0 kubenswrapper[4102]: I0312 12:21:12.190962 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq9d5\" (UniqueName: \"kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:12.191565 master-0 kubenswrapper[4102]: I0312 12:21:12.191537 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.191618 master-0 kubenswrapper[4102]: E0312 12:21:12.191601 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 12 12:21:12.191648 master-0 kubenswrapper[4102]: E0312 12:21:12.191627 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.691618366 +0000 UTC m=+117.624394781 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found Mar 12 12:21:12.192052 master-0 kubenswrapper[4102]: I0312 12:21:12.192037 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.192659 master-0 kubenswrapper[4102]: I0312 12:21:12.192596 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.195412 master-0 kubenswrapper[4102]: I0312 12:21:12.195384 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.195606 master-0 kubenswrapper[4102]: I0312 12:21:12.195574 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 
12:21:12.195713 master-0 kubenswrapper[4102]: I0312 12:21:12.195691 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.198029 master-0 kubenswrapper[4102]: I0312 12:21:12.197998 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.199126 master-0 kubenswrapper[4102]: I0312 12:21:12.199093 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.199882 master-0 kubenswrapper[4102]: I0312 12:21:12.199853 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.218940 master-0 kubenswrapper[4102]: I0312 12:21:12.218913 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbztv\" (UniqueName: \"kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv\") pod 
\"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:21:12.221339 master-0 kubenswrapper[4102]: I0312 12:21:12.221275 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.222142 master-0 kubenswrapper[4102]: I0312 12:21:12.222121 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdvf6\" (UniqueName: \"kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:12.222440 master-0 kubenswrapper[4102]: I0312 12:21:12.222412 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6flvz\" (UniqueName: \"kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:12.224428 master-0 kubenswrapper[4102]: I0312 12:21:12.224401 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:12.224596 master-0 kubenswrapper[4102]: I0312 12:21:12.224460 4102 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5sd\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:12.225571 master-0 kubenswrapper[4102]: I0312 12:21:12.225552 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2c4\" (UniqueName: \"kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:12.226397 master-0 kubenswrapper[4102]: I0312 12:21:12.226354 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8mvz\" (UniqueName: \"kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:12.226757 master-0 kubenswrapper[4102]: I0312 12:21:12.226721 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l57v\" (UniqueName: \"kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:12.226946 master-0 kubenswrapper[4102]: I0312 12:21:12.226917 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmlzw\" (UniqueName: \"kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw\") pod 
\"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:12.230183 master-0 kubenswrapper[4102]: I0312 12:21:12.230154 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq9d5\" (UniqueName: \"kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:12.231257 master-0 kubenswrapper[4102]: I0312 12:21:12.231196 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svpvs\" (UniqueName: \"kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:21:12.231537 master-0 kubenswrapper[4102]: I0312 12:21:12.231507 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97xk\" (UniqueName: \"kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:12.238111 master-0 kubenswrapper[4102]: I0312 12:21:12.237128 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwfct\" (UniqueName: \"kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:12.291677 master-0 kubenswrapper[4102]: I0312 12:21:12.291623 4102 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:21:12.291813 master-0 kubenswrapper[4102]: I0312 12:21:12.291683 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lnbq\" (UniqueName: \"kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:12.291813 master-0 kubenswrapper[4102]: I0312 12:21:12.291736 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:21:12.291890 master-0 kubenswrapper[4102]: I0312 12:21:12.291768 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:12.291890 master-0 kubenswrapper[4102]: I0312 12:21:12.291843 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9gmt\" (UniqueName: 
\"kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:21:12.292505 master-0 kubenswrapper[4102]: E0312 12:21:12.292438 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 12 12:21:12.292571 master-0 kubenswrapper[4102]: E0312 12:21:12.292546 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.792520056 +0000 UTC m=+117.725296481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found Mar 12 12:21:12.292673 master-0 kubenswrapper[4102]: I0312 12:21:12.292640 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:12.292746 master-0 kubenswrapper[4102]: I0312 12:21:12.292727 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config\") pod 
\"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:12.292800 master-0 kubenswrapper[4102]: I0312 12:21:12.292774 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:12.292842 master-0 kubenswrapper[4102]: I0312 12:21:12.292810 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r7v\" (UniqueName: \"kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:12.292888 master-0 kubenswrapper[4102]: I0312 12:21:12.292856 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:12.292949 master-0 kubenswrapper[4102]: I0312 12:21:12.292890 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:12.292994 
master-0 kubenswrapper[4102]: I0312 12:21:12.292969 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:12.293041 master-0 kubenswrapper[4102]: I0312 12:21:12.292999 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:12.293084 master-0 kubenswrapper[4102]: I0312 12:21:12.293051 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:21:12.293126 master-0 kubenswrapper[4102]: I0312 12:21:12.293079 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:12.293126 master-0 kubenswrapper[4102]: I0312 12:21:12.293118 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9dhfq\" (UniqueName: \"kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.294133 master-0 kubenswrapper[4102]: I0312 12:21:12.294100 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.294552 master-0 kubenswrapper[4102]: I0312 12:21:12.294525 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:12.294788 master-0 kubenswrapper[4102]: I0312 12:21:12.294765 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.294941 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295048 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295089 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295122 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295155 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295192 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx48\" (UniqueName: \"kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48\") pod \"csi-snapshot-controller-operator-5685fbc7d-vmj4h\" (UID: \"5a012d0b-d1a8-4cd3-8b91-b346d0445f24\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295223 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295268 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295290 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295315 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xlx\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295354 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295386 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295419 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:12.295453 master-0 kubenswrapper[4102]: I0312 12:21:12.295453 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:12.296114 master-0 kubenswrapper[4102]: I0312 12:21:12.295523 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqcrz\" (UniqueName: \"kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:12.296114 master-0 kubenswrapper[4102]: I0312 12:21:12.295587 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:12.296114 master-0 kubenswrapper[4102]: I0312 12:21:12.295634 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:21:12.296114 master-0 kubenswrapper[4102]: I0312 12:21:12.295664 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:12.296627 master-0 kubenswrapper[4102]: I0312 12:21:12.296593 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:12.296825 master-0 kubenswrapper[4102]: I0312 12:21:12.296788 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.296825 master-0 kubenswrapper[4102]: E0312 12:21:12.296817 4102 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:12.296937 master-0 kubenswrapper[4102]: E0312 12:21:12.296877 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.7968609 +0000 UTC m=+117.729637315 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:12.297118 master-0 kubenswrapper[4102]: I0312 12:21:12.297020 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.297461 master-0 kubenswrapper[4102]: E0312 12:21:12.297145 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:12.297767 master-0 kubenswrapper[4102]: I0312 12:21:12.297744 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:12.308915 master-0 kubenswrapper[4102]: I0312 12:21:12.297870 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:12.309601 master-0 kubenswrapper[4102]: I0312 12:21:12.309427 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:12.309928 master-0 kubenswrapper[4102]: I0312 12:21:12.309889 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:21:12.309982 master-0 kubenswrapper[4102]: I0312 12:21:12.309937 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.309982 master-0 kubenswrapper[4102]: E0312 12:21:12.309969 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:12.797472805 +0000 UTC m=+117.730249220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:12.310082 master-0 kubenswrapper[4102]: I0312 12:21:12.309992 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:21:12.310082 master-0 kubenswrapper[4102]: I0312 12:21:12.310046 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:12.310620 master-0 kubenswrapper[4102]: I0312 12:21:12.310561 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:12.313075 master-0 kubenswrapper[4102]: I0312 12:21:12.313031 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:12.314269 master-0 kubenswrapper[4102]: I0312 12:21:12.313792 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:12.314269 master-0 kubenswrapper[4102]: I0312 12:21:12.313954 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:12.314269 master-0 kubenswrapper[4102]: I0312 12:21:12.314228 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:12.316071 master-0 kubenswrapper[4102]: I0312 12:21:12.315317 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r7v\" (UniqueName: \"kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:12.316071 master-0 kubenswrapper[4102]: I0312 12:21:12.315745 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lnbq\" (UniqueName: \"kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:12.317941 master-0 kubenswrapper[4102]: I0312 12:21:12.316950 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9gmt\" (UniqueName: \"kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:12.319220 master-0 kubenswrapper[4102]: I0312 12:21:12.319121 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhfq\" (UniqueName: \"kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.322144 master-0 kubenswrapper[4102]: I0312 12:21:12.322099 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:12.329083 master-0 kubenswrapper[4102]: I0312 12:21:12.329038 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqcrz\" (UniqueName: \"kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:12.329903 master-0 kubenswrapper[4102]: I0312 12:21:12.329571 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:12.339300 master-0 kubenswrapper[4102]: I0312 12:21:12.339269 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xlx\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:12.343933 master-0 kubenswrapper[4102]: W0312 12:21:12.343838 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54612733_158f_4a92_a1bf_f4a8d653ffaf.slice/crio-e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757 WatchSource:0}: Error finding container e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757: Status 404 returned error can't find the container with id e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757
Mar 12 12:21:12.361202 master-0 kubenswrapper[4102]: I0312 12:21:12.361157 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:21:12.377038 master-0 kubenswrapper[4102]: I0312 12:21:12.376989 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:21:12.387561 master-0 kubenswrapper[4102]: I0312 12:21:12.387527 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:12.401852 master-0 kubenswrapper[4102]: I0312 12:21:12.401227 4102 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bx48\" (UniqueName: \"kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48\") pod \"csi-snapshot-controller-operator-5685fbc7d-vmj4h\" (UID: \"5a012d0b-d1a8-4cd3-8b91-b346d0445f24\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"
Mar 12 12:21:12.416602 master-0 kubenswrapper[4102]: I0312 12:21:12.416517 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k"
Mar 12 12:21:12.424251 master-0 kubenswrapper[4102]: I0312 12:21:12.423874 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:21:12.435672 master-0 kubenswrapper[4102]: I0312 12:21:12.432893 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:21:12.458963 master-0 kubenswrapper[4102]: I0312 12:21:12.457842 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:12.472654 master-0 kubenswrapper[4102]: I0312 12:21:12.472037 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"
Mar 12 12:21:12.496474 master-0 kubenswrapper[4102]: I0312 12:21:12.496154 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:12.519519 master-0 kubenswrapper[4102]: I0312 12:21:12.515362 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"]
Mar 12 12:21:12.532825 master-0 kubenswrapper[4102]: W0312 12:21:12.532595 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6571f5e5_07ee_4e6c_a8ad_277bc52e35ee.slice/crio-e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c WatchSource:0}: Error finding container e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c: Status 404 returned error can't find the container with id e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c
Mar 12 12:21:12.558157 master-0 kubenswrapper[4102]: I0312 12:21:12.555820 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:12.575346 master-0 kubenswrapper[4102]: I0312 12:21:12.574763 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:12.579645 master-0 kubenswrapper[4102]: I0312 12:21:12.578883 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:21:12.592625 master-0 kubenswrapper[4102]: I0312 12:21:12.592576 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:21:12.620962 master-0 kubenswrapper[4102]: I0312 12:21:12.620782 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"]
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.699756 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.699832 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.699854 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: E0312 12:21:12.699968 4102 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: E0312 12:21:12.699987 4102 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: E0312 12:21:12.700014 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.70000106 +0000 UTC m=+118.632777475 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: E0312 12:21:12.700084 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.700057661 +0000 UTC m=+118.632834116 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.700127 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.700171 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.700223 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.700257 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: I0312 12:21:12.700316 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: E0312 12:21:12.700410 4102 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: E0312 12:21:12.700434 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.700427 +0000 UTC m=+118.633203405 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:12.703344 master-0 kubenswrapper[4102]: E0312 12:21:12.700469 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700509 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.700502462 +0000 UTC m=+118.633278877 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700541 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700559 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.700552273 +0000 UTC m=+118.633328688 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700590 4102 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700605 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.700600034 +0000 UTC m=+118.633376449 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700634 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700648 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.700643645 +0000 UTC m=+118.633420060 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700681 4102 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:12.704008 master-0 kubenswrapper[4102]: E0312 12:21:12.700699 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.700694476 +0000 UTC m=+118.633470881 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found
Mar 12 12:21:12.742820 master-0 kubenswrapper[4102]: W0312 12:21:12.737815 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ebe5b05_95d6_43ff_95a4_0c9c7ce70326.slice/crio-2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba WatchSource:0}: Error finding container 2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba: Status 404 returned error can't find the container with id 2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba
Mar 12 12:21:12.742820 master-0 kubenswrapper[4102]: I0312 12:21:12.739389 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k"]
Mar 12 12:21:12.742820 master-0 kubenswrapper[4102]: I0312 12:21:12.741975 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"]
Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: I0312 12:21:12.801791 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: I0312 12:21:12.801927 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: I0312 12:21:12.801952 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: E0312 12:21:12.802106 4102 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: E0312 12:21:12.802172 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.80215727 +0000 UTC m=+118.734933685 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: E0312 12:21:12.802534 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: E0312 12:21:12.802609 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.80259186 +0000 UTC m=+118.735368275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: E0312 12:21:12.802650 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 12 12:21:12.806600 master-0 kubenswrapper[4102]: E0312 12:21:12.802681 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:13.802673332 +0000 UTC m=+118.735449747 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found Mar 12 12:21:12.839454 master-0 kubenswrapper[4102]: I0312 12:21:12.839400 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:21:12.865504 master-0 kubenswrapper[4102]: I0312 12:21:12.864574 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 12 12:21:12.880142 master-0 kubenswrapper[4102]: I0312 12:21:12.880002 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"] Mar 12 12:21:12.885354 master-0 kubenswrapper[4102]: I0312 12:21:12.885299 4102 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 12 12:21:12.890032 master-0 kubenswrapper[4102]: W0312 12:21:12.889983 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b960fe2_d59e_4ee1_bd9d_455b46753cb9.slice/crio-cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9 WatchSource:0}: Error finding container cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9: Status 404 returned error can't find the container with id cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9 Mar 12 12:21:12.899068 master-0 kubenswrapper[4102]: I0312 12:21:12.899018 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"] Mar 12 12:21:12.920882 master-0 kubenswrapper[4102]: I0312 12:21:12.918756 4102 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"] Mar 12 12:21:12.930153 master-0 kubenswrapper[4102]: I0312 12:21:12.930123 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"] Mar 12 12:21:12.944428 master-0 kubenswrapper[4102]: W0312 12:21:12.944358 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda154f648_b96d_449e_b0f5_ba32266000c2.slice/crio-d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587 WatchSource:0}: Error finding container d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587: Status 404 returned error can't find the container with id d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587 Mar 12 12:21:12.972868 master-0 kubenswrapper[4102]: I0312 12:21:12.971894 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"] Mar 12 12:21:12.974169 master-0 kubenswrapper[4102]: I0312 12:21:12.974097 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"] Mar 12 12:21:12.975130 master-0 kubenswrapper[4102]: I0312 12:21:12.975079 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"] Mar 12 12:21:12.982726 master-0 kubenswrapper[4102]: W0312 12:21:12.982691 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a012d0b_d1a8_4cd3_8b91_b346d0445f24.slice/crio-c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3 WatchSource:0}: Error finding container 
c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3: Status 404 returned error can't find the container with id c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3 Mar 12 12:21:13.026286 master-0 kubenswrapper[4102]: I0312 12:21:13.026231 4102 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"] Mar 12 12:21:13.038030 master-0 kubenswrapper[4102]: W0312 12:21:13.037979 4102 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda346ac54_02fe_417f_a49d_038e45b13a1d.slice/crio-745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf WatchSource:0}: Error finding container 745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf: Status 404 returned error can't find the container with id 745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf Mar 12 12:21:13.105437 master-0 kubenswrapper[4102]: I0312 12:21:13.105390 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" event={"ID":"ab087440-bdf2-4e2f-9a5a-434d50a2329a","Type":"ContainerStarted","Data":"c999be46e4c7dd19c23c0225b79825f8ad177bab53b4d5cdac62201c5aa7f539"} Mar 12 12:21:13.106762 master-0 kubenswrapper[4102]: I0312 12:21:13.106735 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" event={"ID":"0aeeef2a-f9df-4f87-b985-bd1da94c76c3","Type":"ContainerStarted","Data":"2860b265a556fe93cc79d001d83d971ba4d1223844dca9c9d4f423b151e14d7f"} Mar 12 12:21:13.107997 master-0 kubenswrapper[4102]: I0312 12:21:13.107974 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" 
event={"ID":"55bf535c-93ab-4870-a9d2-c02496d71ef0","Type":"ContainerStarted","Data":"8a760fef0730108276c12f0e7e65889ee4d7455d96d4c3e35ad89035b139d417"} Mar 12 12:21:13.109036 master-0 kubenswrapper[4102]: I0312 12:21:13.109010 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" event={"ID":"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326","Type":"ContainerStarted","Data":"2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba"} Mar 12 12:21:13.109923 master-0 kubenswrapper[4102]: I0312 12:21:13.109901 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" event={"ID":"b5890f0c-cebe-4788-89f7-27568d875741","Type":"ContainerStarted","Data":"4300be3c5fe59df72fe35edd262229cf307037ab319e70ec8058015a01d299e1"} Mar 12 12:21:13.110763 master-0 kubenswrapper[4102]: I0312 12:21:13.110748 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" event={"ID":"8a121d0d-d201-446b-97a1-e2414e599f4a","Type":"ContainerStarted","Data":"6519636251c6c38ef9d066c61d6299777a9c7b8af3d694989727eb85f1e60cdc"} Mar 12 12:21:13.111717 master-0 kubenswrapper[4102]: I0312 12:21:13.111690 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" event={"ID":"a346ac54-02fe-417f-a49d-038e45b13a1d","Type":"ContainerStarted","Data":"745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf"} Mar 12 12:21:13.112725 master-0 kubenswrapper[4102]: I0312 12:21:13.112708 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" event={"ID":"a154f648-b96d-449e-b0f5-ba32266000c2","Type":"ContainerStarted","Data":"d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587"} Mar 12 12:21:13.114042 
master-0 kubenswrapper[4102]: I0312 12:21:13.113610 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xqmw9" event={"ID":"54612733-158f-4a92-a1bf-f4a8d653ffaf","Type":"ContainerStarted","Data":"e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757"} Mar 12 12:21:13.114644 master-0 kubenswrapper[4102]: I0312 12:21:13.114622 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" event={"ID":"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee","Type":"ContainerStarted","Data":"e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c"} Mar 12 12:21:13.123191 master-0 kubenswrapper[4102]: I0312 12:21:13.123160 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" event={"ID":"ea80247e-b4dd-45dc-8255-6e68508c8480","Type":"ContainerStarted","Data":"9576089e2647ba9a9f5df3c572d8e7de7b7129020d7e31900e7d5c8dd8366e64"} Mar 12 12:21:13.124221 master-0 kubenswrapper[4102]: I0312 12:21:13.124195 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" event={"ID":"9b960fe2-d59e-4ee1-bd9d-455b46753cb9","Type":"ContainerStarted","Data":"cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9"} Mar 12 12:21:13.125301 master-0 kubenswrapper[4102]: I0312 12:21:13.125280 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" event={"ID":"5a012d0b-d1a8-4cd3-8b91-b346d0445f24","Type":"ContainerStarted","Data":"c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3"} Mar 12 12:21:13.718319 master-0 kubenswrapper[4102]: I0312 12:21:13.718267 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:13.718517 master-0 kubenswrapper[4102]: I0312 12:21:13.718344 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:13.718517 master-0 kubenswrapper[4102]: I0312 12:21:13.718368 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:13.718517 master-0 kubenswrapper[4102]: I0312 12:21:13.718387 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:13.718517 master-0 kubenswrapper[4102]: I0312 12:21:13.718413 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:13.718517 master-0 kubenswrapper[4102]: I0312 12:21:13.718434 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:13.718517 master-0 kubenswrapper[4102]: I0312 12:21:13.718467 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:13.718691 master-0 kubenswrapper[4102]: I0312 12:21:13.718536 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:13.718725 master-0 kubenswrapper[4102]: E0312 12:21:13.718696 4102 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:13.718761 master-0 kubenswrapper[4102]: E0312 12:21:13.718742 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.718728993 +0000 UTC m=+120.651505408 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719669 4102 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719721 4102 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719756 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.719733937 +0000 UTC m=+120.652510352 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719793 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.719772668 +0000 UTC m=+120.652549153 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719801 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719820 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.719814849 +0000 UTC m=+120.652591264 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719831 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719848 4102 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719861 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:21:15.71984616 +0000 UTC m=+120.652622575 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719669 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719875 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.71986968 +0000 UTC m=+120.652646095 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719893 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.719888031 +0000 UTC m=+120.652664446 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719920 4102 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:13.719997 master-0 kubenswrapper[4102]: E0312 12:21:13.719940 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.719935392 +0000 UTC m=+120.652711807 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: I0312 12:21:13.820100 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: I0312 12:21:13.820155 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: I0312 12:21:13.820240 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: E0312 12:21:13.820388 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: E0312 12:21:13.820442 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.820425232 +0000 UTC m=+120.753201647 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: E0312 12:21:13.820838 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: E0312 12:21:13.820872 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.820862703 +0000 UTC m=+120.753639118 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: E0312 12:21:13.820914 4102 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 12 12:21:13.821007 master-0 kubenswrapper[4102]: E0312 12:21:13.820939 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:15.820931454 +0000 UTC m=+120.753707869 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:13.842538 master-0 kubenswrapper[4102]: I0312 12:21:13.841841 4102 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:13.847040 master-0 kubenswrapper[4102]: I0312 12:21:13.846629 4102 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 12 12:21:14.132621 master-0 kubenswrapper[4102]: I0312 12:21:14.132515 4102 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" event={"ID":"8a121d0d-d201-446b-97a1-e2414e599f4a","Type":"ContainerStarted","Data":"13d86bfb78c4cbeb08e1a2822a58d3b19158c64d8933a782a2465848cf9de135"}
Mar 12 12:21:14.345590 master-0 kubenswrapper[4102]: I0312 12:21:14.345156 4102 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" podStartSLOduration=76.345139187 podStartE2EDuration="1m16.345139187s" podCreationTimestamp="2026-03-12 12:19:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:21:14.344390019 +0000 UTC m=+119.277166454" watchObservedRunningTime="2026-03-12 12:21:14.345139187 +0000 UTC m=+119.277915602"
Mar 12 12:21:15.741748 master-0 kubenswrapper[4102]: I0312 12:21:15.741543 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:15.742155 master-0 kubenswrapper[4102]: I0312 12:21:15.741757 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:15.742155 master-0 kubenswrapper[4102]: I0312 12:21:15.741796 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:15.742155 master-0 kubenswrapper[4102]: I0312 12:21:15.741830 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:15.742155 master-0 kubenswrapper[4102]: E0312 12:21:15.742016 4102 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:15.742155 master-0 kubenswrapper[4102]: E0312 12:21:15.742097 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742076213 +0000 UTC m=+124.674852628 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found
Mar 12 12:21:15.742155 master-0 kubenswrapper[4102]: I0312 12:21:15.742124 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:15.742324 master-0 kubenswrapper[4102]: E0312 12:21:15.742174 4102 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:15.742324 master-0 kubenswrapper[4102]: E0312 12:21:15.742222 4102 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 12 12:21:15.742324 master-0 kubenswrapper[4102]: E0312 12:21:15.742232 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742215546 +0000 UTC m=+124.674991961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found
Mar 12 12:21:15.742324 master-0 kubenswrapper[4102]: I0312 12:21:15.742174 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:15.742324 master-0 kubenswrapper[4102]: E0312 12:21:15.742250 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742241657 +0000 UTC m=+124.675018162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found
Mar 12 12:21:15.742324 master-0 kubenswrapper[4102]: I0312 12:21:15.742270 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:15.742324 master-0 kubenswrapper[4102]: I0312 12:21:15.742308 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:21:15.742519 master-0 kubenswrapper[4102]: E0312 12:21:15.742382 4102 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 12 12:21:15.742519 master-0 kubenswrapper[4102]: E0312 12:21:15.742404 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742397711 +0000 UTC m=+124.675174126 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found
Mar 12 12:21:15.742519 master-0 kubenswrapper[4102]: E0312 12:21:15.742428 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 12 12:21:15.742519 master-0 kubenswrapper[4102]: E0312 12:21:15.742495 4102 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:15.742655 master-0 kubenswrapper[4102]: E0312 12:21:15.742519 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:15.742655 master-0 kubenswrapper[4102]: E0312 12:21:15.742434 4102 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 12 12:21:15.742655 master-0 kubenswrapper[4102]: E0312 12:21:15.742525 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742497843 +0000 UTC m=+124.675274308 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:15.742986 master-0 kubenswrapper[4102]: E0312 12:21:15.742576 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742561535 +0000 UTC m=+124.675338010 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:15.742986 master-0 kubenswrapper[4102]: E0312 12:21:15.742889 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742865882 +0000 UTC m=+124.675642347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:15.742986 master-0 kubenswrapper[4102]: E0312 12:21:15.742905 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.742897573 +0000 UTC m=+124.675674098 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found
Mar 12 12:21:15.842820 master-0 kubenswrapper[4102]: I0312 12:21:15.842756 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:15.843029 master-0 kubenswrapper[4102]: I0312 12:21:15.842845 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:15.843029 master-0 kubenswrapper[4102]: I0312 12:21:15.842865 4102 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:15.843029 master-0 kubenswrapper[4102]: E0312 12:21:15.842993 4102 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:15.843170 master-0 kubenswrapper[4102]: E0312 12:21:15.843037 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.843024764 +0000 UTC m=+124.775801179 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:15.843170 master-0 kubenswrapper[4102]: E0312 12:21:15.843074 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 12 12:21:15.843170 master-0 kubenswrapper[4102]: E0312 12:21:15.843091 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.843086146 +0000 UTC m=+124.775862561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found
Mar 12 12:21:15.843170 master-0 kubenswrapper[4102]: E0312 12:21:15.843122 4102 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:15.843170 master-0 kubenswrapper[4102]: E0312 12:21:15.843139 4102 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.843134407 +0000 UTC m=+124.775910822 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:17.366644 master-0 kubenswrapper[4102]: I0312 12:21:17.365766 4102 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 12:21:17.366368 master-0 systemd[1]: Stopping Kubernetes Kubelet...
Mar 12 12:21:17.381581 master-0 systemd[1]: kubelet.service: Deactivated successfully.
Mar 12 12:21:17.382176 master-0 systemd[1]: Stopped Kubernetes Kubelet.
Mar 12 12:21:17.385524 master-0 systemd[1]: kubelet.service: Consumed 9.884s CPU time.
Mar 12 12:21:17.396470 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 12 12:21:17.496034 master-0 kubenswrapper[7320]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 12:21:17.496708 master-0 kubenswrapper[7320]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 12 12:21:17.496708 master-0 kubenswrapper[7320]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 12:21:17.496708 master-0 kubenswrapper[7320]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 12:21:17.496708 master-0 kubenswrapper[7320]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 12 12:21:17.496708 master-0 kubenswrapper[7320]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 12 12:21:17.496708 master-0 kubenswrapper[7320]: I0312 12:21:17.496322 7320 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 12 12:21:17.499949 master-0 kubenswrapper[7320]: W0312 12:21:17.499913 7320 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 12:21:17.499949 master-0 kubenswrapper[7320]: W0312 12:21:17.499942 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 12:21:17.499949 master-0 kubenswrapper[7320]: W0312 12:21:17.499948 7320 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 12:21:17.499949 master-0 kubenswrapper[7320]: W0312 12:21:17.499955 7320 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.499961 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.499966 7320 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.499972 7320 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.499979 7320 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.499984 7320 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.499989 7320 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.500011 7320 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.500017 7320 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.500022 7320 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.500098 7320 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.500103 7320 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 12:21:17.500097 master-0 kubenswrapper[7320]: W0312 12:21:17.500107 7320 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500112 7320 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500116 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500120 7320 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500125 7320 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500130 7320 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500134 7320 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500137 7320 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500141 7320 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500146 7320 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500149 7320 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500153 7320 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500156 7320 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500160 7320 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500178 7320 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500183 7320 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500186 7320 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500190 7320 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500194 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500255 7320 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 12:21:17.500359 master-0 kubenswrapper[7320]: W0312 12:21:17.500259 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500263 7320 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500267 7320 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500270 7320 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500274 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500277 7320 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500281 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500299 7320 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500304 7320 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500309 7320 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500314 7320 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500319 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500323 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500328 7320 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500334 7320 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500339 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500344 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500349 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500354 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 12:21:17.500870 master-0 kubenswrapper[7320]: W0312 12:21:17.500442 7320 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500453 7320 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500458 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500462 7320 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500467 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500601 7320 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500608 7320 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500611 7320 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500630 7320 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500634 7320 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500638 7320 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500641 7320 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500645 7320 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500648 7320 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500723 7320 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500727 7320 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500731 7320 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: W0312 12:21:17.500734 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: I0312 12:21:17.501024 7320 flags.go:64] FLAG: --address="0.0.0.0"
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: I0312 12:21:17.501110 7320 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: I0312 12:21:17.501121 7320 flags.go:64] FLAG: --anonymous-auth="true"
Mar 12 12:21:17.501324 master-0 kubenswrapper[7320]: I0312 12:21:17.501144 7320 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501167 7320 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501174 7320 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501183 7320 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501267 7320 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501274 7320 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501279 7320 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501285 7320 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501292 7320 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501297 7320 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501302 7320 flags.go:64] FLAG: --cgroup-root=""
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501308 7320 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501313 7320 flags.go:64] FLAG: --client-ca-file=""
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501345 7320 flags.go:64] FLAG: --cloud-config=""
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501352 7320 flags.go:64] FLAG: --cloud-provider=""
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501357 7320 flags.go:64] FLAG: --cluster-dns="[]"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501364 7320 flags.go:64] FLAG: --cluster-domain=""
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501369 7320 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501375 7320 flags.go:64] FLAG: --config-dir=""
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501380 7320 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501387 7320 flags.go:64] FLAG: --container-log-max-files="5"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501394 7320 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501399 7320 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501423 7320 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 12 12:21:17.501795 master-0 kubenswrapper[7320]: I0312 12:21:17.501429 7320 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501433 7320 flags.go:64] FLAG: --contention-profiling="false"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501438 7320 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501442 7320 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501446 7320 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501450 7320 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501456 7320 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501460 7320 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501464 7320 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501468 7320 flags.go:64] FLAG: --enable-load-reader="false"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501494 7320 flags.go:64] FLAG: --enable-server="true"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501499 7320 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501505 7320 flags.go:64] FLAG: --event-burst="100"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501510 7320 flags.go:64] FLAG: --event-qps="50"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501514 7320 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501518 7320 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501522 7320 flags.go:64] FLAG: --eviction-hard=""
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501527 7320 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501532 7320 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501536 7320 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501540 7320 flags.go:64] FLAG: --eviction-soft=""
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501545 7320 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501549 7320 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501553 7320 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501574 7320 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 12 12:21:17.502317 master-0 kubenswrapper[7320]: I0312 12:21:17.501579 7320 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501583 7320 flags.go:64] FLAG: --fail-swap-on="true"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501587 7320 flags.go:64] FLAG: --feature-gates=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501592 7320 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501596 7320 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501601 7320 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501605 7320 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501610 7320 flags.go:64] FLAG: --healthz-port="10248"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501614 7320 flags.go:64] FLAG: --help="false"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501618 7320 flags.go:64] FLAG: --hostname-override=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501622 7320 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501627 7320 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501631 7320 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501652 7320 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501656 7320 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501660 7320 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501665 7320 flags.go:64] FLAG: --image-service-endpoint=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501669 7320 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501673 7320 flags.go:64] FLAG: --kube-api-burst="100"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501677 7320 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501681 7320 flags.go:64] FLAG: --kube-api-qps="50"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501685 7320 flags.go:64] FLAG: --kube-reserved=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501691 7320 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501695 7320 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501700 7320 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501703 7320 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 12 12:21:17.502856 master-0 kubenswrapper[7320]: I0312 12:21:17.501708 7320 flags.go:64] FLAG: --lock-file=""
Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501728 7320 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501733 7320 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501737 7320 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501744 7320 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501748 7320 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501753 7320 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501757 7320 flags.go:64] FLAG: --logging-format="text" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501760 7320 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501765 7320 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501769 7320 flags.go:64] FLAG: --manifest-url="" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501773 7320 flags.go:64] FLAG: --manifest-url-header="" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501779 7320 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501783 7320 flags.go:64] FLAG: --max-open-files="1000000" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501805 7320 flags.go:64] FLAG: --max-pods="110" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501810 7320 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501814 7320 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501819 7320 flags.go:64] FLAG: --memory-manager-policy="None" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501824 7320 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501830 7320 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501835 7320 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501839 7320 flags.go:64] FLAG: 
--node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501855 7320 flags.go:64] FLAG: --node-status-max-images="50" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501861 7320 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 12 12:21:17.503506 master-0 kubenswrapper[7320]: I0312 12:21:17.501886 7320 flags.go:64] FLAG: --oom-score-adj="-999" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501895 7320 flags.go:64] FLAG: --pod-cidr="" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501900 7320 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501911 7320 flags.go:64] FLAG: --pod-manifest-path="" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501918 7320 flags.go:64] FLAG: --pod-max-pids="-1" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501923 7320 flags.go:64] FLAG: --pods-per-core="0" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501928 7320 flags.go:64] FLAG: --port="10250" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501934 7320 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501940 7320 flags.go:64] FLAG: --provider-id="" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501964 7320 flags.go:64] FLAG: --qos-reserved="" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501971 7320 flags.go:64] FLAG: --read-only-port="10255" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501976 7320 flags.go:64] FLAG: --register-node="true" Mar 12 12:21:17.504025 master-0 
kubenswrapper[7320]: I0312 12:21:17.501980 7320 flags.go:64] FLAG: --register-schedulable="true" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501984 7320 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501992 7320 flags.go:64] FLAG: --registry-burst="10" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.501998 7320 flags.go:64] FLAG: --registry-qps="5" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502001 7320 flags.go:64] FLAG: --reserved-cpus="" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502006 7320 flags.go:64] FLAG: --reserved-memory="" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502011 7320 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502015 7320 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502020 7320 flags.go:64] FLAG: --rotate-certificates="false" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502024 7320 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502045 7320 flags.go:64] FLAG: --runonce="false" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502050 7320 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502054 7320 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 12 12:21:17.504025 master-0 kubenswrapper[7320]: I0312 12:21:17.502058 7320 flags.go:64] FLAG: --seccomp-default="false" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502062 7320 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502066 7320 
flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502070 7320 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502075 7320 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502079 7320 flags.go:64] FLAG: --storage-driver-password="root" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502083 7320 flags.go:64] FLAG: --storage-driver-secure="false" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502087 7320 flags.go:64] FLAG: --storage-driver-table="stats" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502091 7320 flags.go:64] FLAG: --storage-driver-user="root" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502095 7320 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502099 7320 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502104 7320 flags.go:64] FLAG: --system-cgroups="" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502124 7320 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502131 7320 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502135 7320 flags.go:64] FLAG: --tls-cert-file="" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502139 7320 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502144 7320 flags.go:64] FLAG: --tls-min-version="" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502148 7320 flags.go:64] FLAG: 
--tls-private-key-file="" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502153 7320 flags.go:64] FLAG: --topology-manager-policy="none" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502158 7320 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502161 7320 flags.go:64] FLAG: --topology-manager-scope="container" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502166 7320 flags.go:64] FLAG: --v="2" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502174 7320 flags.go:64] FLAG: --version="false" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502181 7320 flags.go:64] FLAG: --vmodule="" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502208 7320 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 12 12:21:17.504650 master-0 kubenswrapper[7320]: I0312 12:21:17.502214 7320 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502375 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502384 7320 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502388 7320 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502392 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502396 7320 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502399 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 12:21:17.505271 master-0 
kubenswrapper[7320]: W0312 12:21:17.502403 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502407 7320 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502411 7320 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502415 7320 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502418 7320 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502422 7320 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502443 7320 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502448 7320 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502455 7320 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502460 7320 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502464 7320 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502470 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 12:21:17.505271 master-0 kubenswrapper[7320]: W0312 12:21:17.502502 7320 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 12:21:17.505271 master-0 
kubenswrapper[7320]: W0312 12:21:17.502508 7320 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502513 7320 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502518 7320 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502522 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502527 7320 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502532 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502536 7320 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502540 7320 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502544 7320 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502548 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502552 7320 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502555 7320 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502578 7320 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502583 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502587 7320 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502591 7320 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502595 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502599 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502608 7320 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 12:21:17.505778 master-0 kubenswrapper[7320]: W0312 12:21:17.502613 7320 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502617 7320 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502621 7320 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502626 7320 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502630 7320 feature_gate.go:330] unrecognized feature gate: Example Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502634 7320 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502654 7320 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502658 7320 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502662 7320 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502666 7320 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502670 7320 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502675 7320 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502680 7320 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502684 7320 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502687 7320 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502691 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502694 7320 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502698 7320 feature_gate.go:330] unrecognized 
feature gate: AutomatedEtcdBackup Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502701 7320 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502705 7320 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 12:21:17.506178 master-0 kubenswrapper[7320]: W0312 12:21:17.502708 7320 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502712 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502731 7320 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502735 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502738 7320 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502742 7320 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502745 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502749 7320 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502752 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502756 7320 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502759 7320 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 
12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502765 7320 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502769 7320 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: W0312 12:21:17.502774 7320 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 12:21:17.506655 master-0 kubenswrapper[7320]: I0312 12:21:17.502787 7320 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 12 12:21:17.513146 master-0 kubenswrapper[7320]: I0312 12:21:17.513106 7320 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 12 12:21:17.513146 master-0 kubenswrapper[7320]: I0312 12:21:17.513136 7320 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 12 12:21:17.513237 master-0 kubenswrapper[7320]: W0312 12:21:17.513213 7320 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 12:21:17.513237 master-0 kubenswrapper[7320]: W0312 12:21:17.513222 7320 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 12:21:17.513237 master-0 kubenswrapper[7320]: W0312 12:21:17.513226 7320 feature_gate.go:330] unrecognized feature gate: Example Mar 12 12:21:17.513237 master-0 kubenswrapper[7320]: W0312 12:21:17.513230 7320 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 
12 12:21:17.513237 master-0 kubenswrapper[7320]: W0312 12:21:17.513235 7320 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 12:21:17.513237 master-0 kubenswrapper[7320]: W0312 12:21:17.513239 7320 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513244 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513250 7320 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513256 7320 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513259 7320 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513263 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513267 7320 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513271 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513274 7320 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513278 7320 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513282 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513286 7320 feature_gate.go:330] unrecognized feature 
gate: ManagedBootImages Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513290 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513293 7320 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513297 7320 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513300 7320 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513304 7320 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513308 7320 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513312 7320 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 12:21:17.513381 master-0 kubenswrapper[7320]: W0312 12:21:17.513316 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513320 7320 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513323 7320 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513326 7320 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513330 7320 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513335 7320 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 12:21:17.513892 
master-0 kubenswrapper[7320]: W0312 12:21:17.513339 7320 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513342 7320 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513346 7320 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513351 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513355 7320 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513361 7320 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513366 7320 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513370 7320 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513374 7320 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513378 7320 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513382 7320 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513385 7320 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513389 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 12:21:17.513892 master-0 kubenswrapper[7320]: W0312 12:21:17.513393 7320 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513397 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513401 7320 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513405 7320 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513408 7320 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513412 7320 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513415 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513419 7320 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513422 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513426 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513429 7320 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513433 7320 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513437 7320 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513440 7320 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513444 7320 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513448 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513451 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513455 7320 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513458 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 12:21:17.514294 master-0 kubenswrapper[7320]: W0312 12:21:17.513463 7320 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513468 7320 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513472 7320 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513491 7320 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513495 7320 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513499 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513503 7320 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513507 7320 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513510 7320 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513514 7320 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: I0312 12:21:17.513520 7320 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513629 7320 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513635 7320 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513639 7320 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513643 7320 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513646 7320 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 12:21:17.514752 master-0 kubenswrapper[7320]: W0312 12:21:17.513650 7320 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513654 7320 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513658 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513661 7320 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513665 7320 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513668 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513672 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513675 7320 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513678 7320 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513683 7320 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513686 7320 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513690 7320 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513693 7320 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513697 7320 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513700 7320 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513704 7320 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513707 7320 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513712 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513716 7320 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513720 7320 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 12:21:17.515148 master-0 kubenswrapper[7320]: W0312 12:21:17.513724 7320 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513727 7320 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513731 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513734 7320 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513738 7320 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513742 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513745 7320 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513749 7320 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513752 7320 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513755 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513759 7320 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513763 7320 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513766 7320 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513770 7320 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513774 7320 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513777 7320 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513782 7320 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513786 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513791 7320 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513794 7320 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 12:21:17.515626 master-0 kubenswrapper[7320]: W0312 12:21:17.513798 7320 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513803 7320 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513807 7320 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513812 7320 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513816 7320 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513821 7320 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513825 7320 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513829 7320 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513833 7320 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513836 7320 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513840 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513843 7320 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513847 7320 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513850 7320 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513854 7320 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513857 7320 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513861 7320 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513864 7320 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513867 7320 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 12:21:17.516061 master-0 kubenswrapper[7320]: W0312 12:21:17.513871 7320 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: W0312 12:21:17.513875 7320 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: W0312 12:21:17.513878 7320 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: W0312 12:21:17.513882 7320 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: W0312 12:21:17.513885 7320 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: W0312 12:21:17.513889 7320 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: W0312 12:21:17.513892 7320 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: W0312 12:21:17.513896 7320 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: I0312 12:21:17.513903 7320 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: I0312 12:21:17.514050 7320 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: I0312 12:21:17.515472 7320 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: I0312 12:21:17.515550 7320 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: I0312 12:21:17.515730 7320 server.go:997] "Starting client certificate rotation"
Mar 12 12:21:17.516458 master-0 kubenswrapper[7320]: I0312 12:21:17.515740 7320 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 12 12:21:17.516806 master-0 kubenswrapper[7320]: I0312 12:21:17.515988 7320 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-13 12:11:16 +0000 UTC, rotation deadline is 2026-03-13 09:31:39.054520167 +0000 UTC
Mar 12 12:21:17.516806 master-0 kubenswrapper[7320]: I0312 12:21:17.516027 7320 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h10m21.538495307s for next certificate rotation
Mar 12 12:21:17.516806 master-0 kubenswrapper[7320]: I0312 12:21:17.516249 7320 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 12:21:17.517369 master-0 kubenswrapper[7320]: I0312 12:21:17.517343 7320 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 12:21:17.522100 master-0 kubenswrapper[7320]: I0312 12:21:17.522067 7320 log.go:25] "Validated CRI v1 runtime API"
Mar 12 12:21:17.524960 master-0 kubenswrapper[7320]: I0312 12:21:17.524927 7320 log.go:25] "Validated CRI v1 image API"
Mar 12 12:21:17.526011 master-0 kubenswrapper[7320]: I0312 12:21:17.525983 7320 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 12:21:17.529895 master-0 kubenswrapper[7320]: I0312 12:21:17.529842 7320 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 f1cf7764-854b-4c2c-9df4-b92427278cd1:/dev/vda3]
Mar 12 12:21:17.530158 master-0 kubenswrapper[7320]: I0312 12:21:17.529877 7320 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a/userdata/shm major:0 minor:134 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2860b265a556fe93cc79d001d83d971ba4d1223844dca9c9d4f423b151e14d7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2860b265a556fe93cc79d001d83d971ba4d1223844dca9c9d4f423b151e14d7f/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba/userdata/shm major:0 minor:257 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0}
/run/containers/storage/overlay-containers/4300be3c5fe59df72fe35edd262229cf307037ab319e70ec8058015a01d299e1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4300be3c5fe59df72fe35edd262229cf307037ab319e70ec8058015a01d299e1/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81/userdata/shm major:0 minor:103 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6519636251c6c38ef9d066c61d6299777a9c7b8af3d694989727eb85f1e60cdc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6519636251c6c38ef9d066c61d6299777a9c7b8af3d694989727eb85f1e60cdc/userdata/shm major:0 minor:274 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a760fef0730108276c12f0e7e65889ee4d7455d96d4c3e35ad89035b139d417/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a760fef0730108276c12f0e7e65889ee4d7455d96d4c3e35ad89035b139d417/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9576089e2647ba9a9f5df3c572d8e7de7b7129020d7e31900e7d5c8dd8366e64/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9576089e2647ba9a9f5df3c572d8e7de7b7129020d7e31900e7d5c8dd8366e64/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79/userdata/shm major:0 minor:114 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c999be46e4c7dd19c23c0225b79825f8ad177bab53b4d5cdac62201c5aa7f539/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c999be46e4c7dd19c23c0225b79825f8ad177bab53b4d5cdac62201c5aa7f539/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e/userdata/shm major:0 minor:126 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c/userdata/shm major:0 minor:245 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fd48f85cdff86dca5fd974e33c046e44989ec396616456ae028b5495072f5b8b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fd48f85cdff86dca5fd974e33c046e44989ec396616456ae028b5495072f5b8b/userdata/shm major:0 minor:143 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~projected/kube-api-access-m9gmt:{mountpoint:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~projected/kube-api-access-m9gmt major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~secret/serving-cert major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/10498208-0692-4533-b672-a7a2cfcdf1be/volumes/kubernetes.io~projected/kube-api-access-xdwfl:{mountpoint:/var/lib/kubelet/pods/10498208-0692-4533-b672-a7a2cfcdf1be/volumes/kubernetes.io~projected/kube-api-access-xdwfl major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~projected/kube-api-access-b2f6r:{mountpoint:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~projected/kube-api-access-b2f6r major:0 minor:125 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~projected/kube-api-access-p97xk:{mountpoint:/var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~projected/kube-api-access-p97xk major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~projected/kube-api-access-xx2c4:{mountpoint:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~projected/kube-api-access-xx2c4 major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~projected/kube-api-access-wg27g:{mountpoint:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~projected/kube-api-access-wg27g major:0 minor:142 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~secret/webhook-cert major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/54612733-158f-4a92-a1bf-f4a8d653ffaf/volumes/kubernetes.io~projected/kube-api-access-9lnbq:{mountpoint:/var/lib/kubelet/pods/54612733-158f-4a92-a1bf-f4a8d653ffaf/volumes/kubernetes.io~projected/kube-api-access-9lnbq major:0 minor:240 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~projected/kube-api-access-svpvs:{mountpoint:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~projected/kube-api-access-svpvs major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a012d0b-d1a8-4cd3-8b91-b346d0445f24/volumes/kubernetes.io~projected/kube-api-access-9bx48:{mountpoint:/var/lib/kubelet/pods/5a012d0b-d1a8-4cd3-8b91-b346d0445f24/volumes/kubernetes.io~projected/kube-api-access-9bx48 major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~projected/kube-api-access-kv9fk:{mountpoint:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~projected/kube-api-access-kv9fk major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~secret/metrics-tls major:0 minor:101 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~projected/kube-api-access major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/666857a1-0ddf-4b48-91f4-44cce154d1b1/volumes/kubernetes.io~projected/kube-api-access-vrqx7:{mountpoint:/var/lib/kubelet/pods/666857a1-0ddf-4b48-91f4-44cce154d1b1/volumes/kubernetes.io~projected/kube-api-access-vrqx7 major:0 minor:92 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~projected/kube-api-access-jcltq:{mountpoint:/var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~projected/kube-api-access-jcltq major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~secret/serving-cert major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8e6f7496-1047-482d-9203-ff83a9eb7d93/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/8e6f7496-1047-482d-9203-ff83a9eb7d93/volumes/kubernetes.io~projected/kube-api-access major:0 minor:107 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~projected/kube-api-access major:0 minor:252 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~secret/serving-cert major:0 minor:235 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~projected/kube-api-access-xmlzw:{mountpoint:/var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~projected/kube-api-access-xmlzw major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~projected/kube-api-access-x8mvz:{mountpoint:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~projected/kube-api-access-x8mvz major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/kube-api-access-ft5sd:{mountpoint:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/kube-api-access-ft5sd major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~projected/kube-api-access-9dhfq:{mountpoint:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~projected/kube-api-access-9dhfq major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~secret/serving-cert major:0 minor:236 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~projected/kube-api-access-pwfct:{mountpoint:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~projected/kube-api-access-pwfct major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~projected/kube-api-access-6l57v:{mountpoint:/var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~projected/kube-api-access-6l57v major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~projected/kube-api-access-fqcrz:{mountpoint:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~projected/kube-api-access-fqcrz major:0 minor:244 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~secret/serving-cert major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~projected/kube-api-access-s4r7v:{mountpoint:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~projected/kube-api-access-s4r7v major:0 minor:242 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/kube-api-access-r5xlx:{mountpoint:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/kube-api-access-r5xlx major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~projected/kube-api-access-jq9d5:{mountpoint:/var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~projected/kube-api-access-jq9d5 major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~projected/kube-api-access-zdvf6:{mountpoint:/var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~projected/kube-api-access-zdvf6 major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~projected/kube-api-access-tq5c7:{mountpoint:/var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~projected/kube-api-access-tq5c7 major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~projected/kube-api-access-xbztv:{mountpoint:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~projected/kube-api-access-xbztv major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~projected/kube-api-access-gd645:{mountpoint:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~projected/kube-api-access-gd645 major:0 minor:133 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:132 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~projected/kube-api-access-6flvz:{mountpoint:/var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~projected/kube-api-access-6flvz major:0 minor:219 fsType:tmpfs blockSize:0} overlay_0-105:{mountpoint:/var/lib/containers/storage/overlay/6918e18f41709fa6650e2480e0000edf9b35beb3f771ce3fb624c8d3824ab074/merged major:0 minor:105 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/51245bbd57c720378576e3856f5b8b6ad38102e4de5bda74be69158ffc8bc105/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/1a23ca58d99cfb96c0a7416d1fcf8c85e5759214533a5f47defdfac9c30c53f3/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/9f44eb1099f6e7ca6b2919c25fe6b16f301625ab35847f6535e8a0bed60948b6/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-128:{mountpoint:/var/lib/containers/storage/overlay/0227392e62943a5421d5117a45613bb529ec58a6b4e1020c6c4724f64bfeb49b/merged major:0 minor:128 fsType:overlay 
blockSize:0} overlay_0-130:{mountpoint:/var/lib/containers/storage/overlay/de56fda083a3a46b5d9dfe72d06ed1a6f006d2faea2846b62646a3c68dc246fe/merged major:0 minor:130 fsType:overlay blockSize:0} overlay_0-139:{mountpoint:/var/lib/containers/storage/overlay/1de229b94ac4c87df95373ce1d4b224ba337afd7df3c6e2863683e588756d225/merged major:0 minor:139 fsType:overlay blockSize:0} overlay_0-145:{mountpoint:/var/lib/containers/storage/overlay/0e5100b29e5eb59624ed266a476b13b49716cd035088507dca49a2efb851c004/merged major:0 minor:145 fsType:overlay blockSize:0} overlay_0-147:{mountpoint:/var/lib/containers/storage/overlay/123db14c086e03a6fa2a93cd350da3c7d3f6bad3562d41fb9a648bd89953d36e/merged major:0 minor:147 fsType:overlay blockSize:0} overlay_0-149:{mountpoint:/var/lib/containers/storage/overlay/5671824a14c759f74bac5e3e4d4eda71da1e428a35c6a1d759cd3f101ac39dd4/merged major:0 minor:149 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/f99c96a5d52735f372e2f64dc3e047205afbea3602c688881c03774572b374ad/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/4af4fd105f0486945a61f448fbeb63eaf7b1616df39d1edb0936e6849cfc1bcc/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/e77186afdf459bdc81ad0fbf9f8d84fb80abcdefef820e33c453f65ab9511bfd/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/732ffaaaeed58a2af6625eb1cba1ad5af20d11ed9e1cb2c36fdf3dacc399b752/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/f01dadb2943df2cdd191d4eb77d10fc8d66487e339539769133e56d239134687/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/14455f71ceb3566330a851bdd9c0beba40fd46e8b1a3114d513bef5cd9c7697d/merged major:0 minor:179 fsType:overlay blockSize:0} 
overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/8147fd49d7060556f1a6d7ed71b7be78090cee700273c0a3331008c154f3317a/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/e6c234ffa99a8fbfee78f7c28394d10b4f69637de07ad9450a1130477f7178b0/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/b93afabc703f079e793f8c205c4d7498282b6adc4b6926f0310613f4335fa7d9/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/8da7d23860429d271aadedb88a691fe8c0acd58cd6632612c95032b9cf8d7582/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/2d7aab61a22bec48a13bbbf12c4375b33a2b6cb4ce4ba67fe69048ec0ff7e039/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-250:{mountpoint:/var/lib/containers/storage/overlay/d1934d6434bb66e17ad49f3f3037da293f0464476f24037ea944c55e783f9aa6/merged major:0 minor:250 fsType:overlay blockSize:0} overlay_0-269:{mountpoint:/var/lib/containers/storage/overlay/7c69419df66553e83b8bc70bd47f65c61c4b96e95005d7d20c25d1dc33031029/merged major:0 minor:269 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/019d4165db29f7ead30fd39128bf626db5ad5165f5130442249ebe567fc975bf/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/eb487e733038a49d042ad434ccc6ecf82eb467eb91b2eab81747168b2bb31db6/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/1ed9e727adc96ddf7693cc5d755fd8bf0f8dd50d4e80c21b660558df9055ee06/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/5877efa5f57baf9890204c4bb74d7c4023b643b83f62ef3151c90d81295ebb42/merged major:0 minor:285 fsType:overlay blockSize:0} 
overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/dd0828333b7e61e4f647add8c3cfdbf06d9b31077977717f0d765cb044fc1cf1/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/c86702e99dd3c268bbe7182864e6ac52f44cc502e849adbdcb62b1211b133709/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/72045d9d16c8d4e408a28ab3fd7f16afcbe80618d9761ac94ad62e22f2aeaf78/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/40f5fc8b9d4d34715d9ba18873dccbc16b47a4f4fe4218c9aeca37d5187c269f/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/43a25658cd2c258fda0386101fe97332398cc34add2d7f4d6b4b39c126524a10/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/cb7a9227e97c2d237f018ec73f0a458c58aa44cb71edf741e0cbd0e71409cd14/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/16ae3329f6bf42aac4b1fba0b90ace71b79e779f37b5a17ad7706f45a12bce05/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/fc953aaad4925eaf9eb3ae2e191f68666d9a369f6663b747ad38ccd49f1f0704/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/b0a0865328380b98d15d56326f4948b9b2db57aa4825a30611f4103fa370364a/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/74dcc27038dab84c831d6a2b1d09f0302211024d763dd3c18ae2b24726d50f9f/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/93c0f3ee6e3fbaf3d41ef1380eef81803f5a034e5c302d337da2eb7fd8dc131b/merged major:0 minor:48 fsType:overlay blockSize:0} 
overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/6dc0722f44a0ed003a7f5d0ef9cd4b86cce5f069948d9451a148056ff4f29c32/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/6ccaa384cb89c5e36d1b38242e7b182395e30b3129e51672b97b7e210b48f604/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/803068f769bc478cad74c1fca6fce898155e16a04ef05dac778eead35759113a/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/bdef3aa1964cb475c44bbe628ca669e8ddaad524a330e317e92d36fea6a1f6b1/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/743bf3e7417e09e3b5ddebe413e292f8fdebd4545924dbdc41d08fa1bf5a117c/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-66:{mountpoint:/var/lib/containers/storage/overlay/d099963c7da3b02825404ec82640082c3c91ca0d24043663cfbb1eeae030d949/merged major:0 minor:66 fsType:overlay blockSize:0} overlay_0-68:{mountpoint:/var/lib/containers/storage/overlay/e4ccf8d841968d3a931d24210f6be0fec71a0d119c3fefece1c4a375341aae7b/merged major:0 minor:68 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/509b075d54906caba7d3cf821d5b3cb4323a3c665df834a33d2909fa45b546cd/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/e94e83ee0575e12f6d97d33f30c9a30205196d327ccfddc564f48fef9307eac1/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/79cc45b87333ca14ca57e5a26bcc01712963db11462343afcc7104e8c46e8463/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 12 12:21:17.566003 master-0 kubenswrapper[7320]: I0312 12:21:17.564015 7320 manager.go:217] Machine: {Timestamp:2026-03-12 12:21:17.560648311 +0000 UTC m=+0.119692212 CPUVendorID:AuthenticAMD NumCores:12 
NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:2fe2bf4a51c343709ee1f99d3a96e3ea SystemUUID:2fe2bf4a-51c3-4370-9ee1-f99d3a96e3ea BootID:473a10fb-d3cc-4f1f-a79c-240b5ee16b09 Filesystems:[{Device:overlay_0-149 DeviceMajor:0 DeviceMinor:149 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:132 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c999be46e4c7dd19c23c0225b79825f8ad177bab53b4d5cdac62201c5aa7f539/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~projected/kube-api-access-p97xk DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c/userdata/shm DeviceMajor:0 DeviceMinor:245 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:252 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-68 DeviceMajor:0 DeviceMinor:68 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-147 DeviceMajor:0 DeviceMinor:147 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 
DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/kube-api-access-r5xlx DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6519636251c6c38ef9d066c61d6299777a9c7b8af3d694989727eb85f1e60cdc/userdata/shm DeviceMajor:0 DeviceMinor:274 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81/userdata/shm DeviceMajor:0 DeviceMinor:103 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~projected/kube-api-access-wg27g DeviceMajor:0 DeviceMinor:142 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~projected/kube-api-access-s4r7v DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9576089e2647ba9a9f5df3c572d8e7de7b7129020d7e31900e7d5c8dd8366e64/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:141 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~projected/kube-api-access-pwfct DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~projected/kube-api-access-6flvz DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~projected/kube-api-access-jcltq DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba/userdata/shm DeviceMajor:0 DeviceMinor:257 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~projected/kube-api-access-b2f6r DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-128 DeviceMajor:0 DeviceMinor:128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-145 DeviceMajor:0 DeviceMinor:145 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-66 DeviceMajor:0 DeviceMinor:66 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/666857a1-0ddf-4b48-91f4-44cce154d1b1/volumes/kubernetes.io~projected/kube-api-access-vrqx7 DeviceMajor:0 DeviceMinor:92 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~projected/kube-api-access-gd645 DeviceMajor:0 DeviceMinor:133 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~projected/kube-api-access-fqcrz DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a760fef0730108276c12f0e7e65889ee4d7455d96d4c3e35ad89035b139d417/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 
DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~projected/kube-api-access-jq9d5 DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/kube-api-access-ft5sd DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~projected/kube-api-access-tq5c7 DeviceMajor:0 DeviceMinor:123 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~projected/kube-api-access-xmlzw DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~projected/kube-api-access-svpvs DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-250 DeviceMajor:0 DeviceMinor:250 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4300be3c5fe59df72fe35edd262229cf307037ab319e70ec8058015a01d299e1/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:101 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~projected/kube-api-access-m9gmt DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~projected/kube-api-access-x8mvz DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2860b265a556fe93cc79d001d83d971ba4d1223844dca9c9d4f423b151e14d7f/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a/userdata/shm DeviceMajor:0 DeviceMinor:134 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-139 DeviceMajor:0 DeviceMinor:139 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~projected/kube-api-access-xx2c4 DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe/userdata/shm DeviceMajor:0 DeviceMinor:41 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-105 DeviceMajor:0 DeviceMinor:105 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~projected/kube-api-access-9dhfq DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/10498208-0692-4533-b672-a7a2cfcdf1be/volumes/kubernetes.io~projected/kube-api-access-xdwfl DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~projected/kube-api-access-6l57v DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/54612733-158f-4a92-a1bf-f4a8d653ffaf/volumes/kubernetes.io~projected/kube-api-access-9lnbq DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-269 DeviceMajor:0 DeviceMinor:269 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~projected/kube-api-access-kv9fk DeviceMajor:0 DeviceMinor:102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~projected/kube-api-access-xbztv DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e/userdata/shm DeviceMajor:0 DeviceMinor:126 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fd48f85cdff86dca5fd974e33c046e44989ec396616456ae028b5495072f5b8b/userdata/shm DeviceMajor:0 DeviceMinor:143 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-130 DeviceMajor:0 DeviceMinor:130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~projected/kube-api-access-zdvf6 DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5a012d0b-d1a8-4cd3-8b91-b346d0445f24/volumes/kubernetes.io~projected/kube-api-access-9bx48 DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/8e6f7496-1047-482d-9203-ff83a9eb7d93/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:107 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:2860b265a556fe9 MacAddress:e6:c1:67:b2:79:ea Speed:10000 Mtu:8900} {Name:2d15b9b0e60b6b3 MacAddress:ce:c3:f0:12:75:db Speed:10000 Mtu:8900} {Name:4300be3c5fe59df MacAddress:ce:d8:bb:7c:27:31 Speed:10000 Mtu:8900} {Name:6519636251c6c38 MacAddress:82:37:e8:5f:c7:a3 Speed:10000 Mtu:8900} {Name:745f17315daebf9 MacAddress:1e:9b:73:a4:06:32 Speed:10000 Mtu:8900} {Name:8a760fef0730108 MacAddress:de:84:a2:04:65:37 Speed:10000 Mtu:8900} {Name:9576089e2647ba9 MacAddress:3e:7f:04:69:1b:62 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:02:31:d3:8a:3a:c8 Speed:0 Mtu:8900} {Name:c4db6563159c41f MacAddress:46:80:36:f7:ae:0a Speed:10000 Mtu:8900} {Name:c999be46e4c7dd1 MacAddress:ba:8b:5e:14:c9:5c Speed:10000 Mtu:8900} {Name:cebeefbf62ca940 MacAddress:76:0c:14:65:20:6e Speed:10000 Mtu:8900} {Name:d9d20b5228d0e4a MacAddress:6e:7a:6f:35:0e:6c Speed:10000 Mtu:8900} {Name:e19aac61221800c MacAddress:2e:cd:23:30:20:1f Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:30:db:e2 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:e7:70:da Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:82:08:64:40:9a:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 
Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 12 12:21:17.566439 master-0 kubenswrapper[7320]: I0312 12:21:17.566068 7320 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 12 12:21:17.566439 master-0 kubenswrapper[7320]: I0312 12:21:17.566284 7320 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 12 12:21:17.567155 master-0 kubenswrapper[7320]: I0312 12:21:17.567125 7320 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 12 12:21:17.567535 master-0 kubenswrapper[7320]: I0312 12:21:17.567384 7320 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 12 12:21:17.568146 master-0 kubenswrapper[7320]: I0312 12:21:17.567530 7320 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentag
e":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 12:21:17.568222 master-0 kubenswrapper[7320]: I0312 12:21:17.568160 7320 topology_manager.go:138] "Creating topology manager with none policy" Mar 12 12:21:17.568325 master-0 kubenswrapper[7320]: I0312 12:21:17.568291 7320 container_manager_linux.go:303] "Creating device plugin manager" Mar 12 12:21:17.568325 master-0 kubenswrapper[7320]: I0312 12:21:17.568308 7320 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 12:21:17.568454 master-0 kubenswrapper[7320]: I0312 12:21:17.568430 7320 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 12:21:17.568859 master-0 kubenswrapper[7320]: I0312 12:21:17.568778 7320 state_mem.go:36] "Initialized new in-memory state store" Mar 12 12:21:17.569308 master-0 kubenswrapper[7320]: I0312 12:21:17.569275 7320 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 12 12:21:17.569428 master-0 kubenswrapper[7320]: I0312 12:21:17.569406 7320 kubelet.go:418] "Attempting to sync node with API server" Mar 12 12:21:17.570467 master-0 kubenswrapper[7320]: I0312 12:21:17.569606 7320 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 12:21:17.571006 master-0 kubenswrapper[7320]: I0312 12:21:17.570964 7320 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 12 12:21:17.571006 master-0 kubenswrapper[7320]: I0312 12:21:17.571002 7320 kubelet.go:324] "Adding apiserver pod source" Mar 12 12:21:17.571082 master-0 
kubenswrapper[7320]: I0312 12:21:17.571017 7320 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 12:21:17.572764 master-0 kubenswrapper[7320]: I0312 12:21:17.572709 7320 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 12 12:21:17.573024 master-0 kubenswrapper[7320]: I0312 12:21:17.572995 7320 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 12 12:21:17.573340 master-0 kubenswrapper[7320]: I0312 12:21:17.573309 7320 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 12 12:21:17.573502 master-0 kubenswrapper[7320]: I0312 12:21:17.573459 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 12 12:21:17.573502 master-0 kubenswrapper[7320]: I0312 12:21:17.573500 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573508 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573515 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573521 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573528 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573534 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573540 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 12 12:21:17.573574 master-0 
kubenswrapper[7320]: I0312 12:21:17.573548 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573553 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 12 12:21:17.573574 master-0 kubenswrapper[7320]: I0312 12:21:17.573579 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 12 12:21:17.573788 master-0 kubenswrapper[7320]: I0312 12:21:17.573592 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 12 12:21:17.573788 master-0 kubenswrapper[7320]: I0312 12:21:17.573619 7320 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 12 12:21:17.574300 master-0 kubenswrapper[7320]: I0312 12:21:17.574246 7320 server.go:1280] "Started kubelet" Mar 12 12:21:17.575140 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 12 12:21:17.575513 master-0 kubenswrapper[7320]: I0312 12:21:17.575428 7320 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 12:21:17.575553 master-0 kubenswrapper[7320]: I0312 12:21:17.575533 7320 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 12 12:21:17.576028 master-0 kubenswrapper[7320]: I0312 12:21:17.575998 7320 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 12:21:17.576088 master-0 kubenswrapper[7320]: I0312 12:21:17.576061 7320 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 12:21:17.586176 master-0 kubenswrapper[7320]: I0312 12:21:17.586102 7320 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 12:21:17.586266 master-0 kubenswrapper[7320]: I0312 12:21:17.586201 7320 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 12:21:17.586432 master-0 kubenswrapper[7320]: 
I0312 12:21:17.586390 7320 server.go:449] "Adding debug handlers to kubelet server" Mar 12 12:21:17.587615 master-0 kubenswrapper[7320]: I0312 12:21:17.587569 7320 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 12 12:21:17.587704 master-0 kubenswrapper[7320]: I0312 12:21:17.587647 7320 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 12 12:21:17.587704 master-0 kubenswrapper[7320]: I0312 12:21:17.587666 7320 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-13 12:11:16 +0000 UTC, rotation deadline is 2026-03-13 05:52:41.986314165 +0000 UTC Mar 12 12:21:17.588170 master-0 kubenswrapper[7320]: I0312 12:21:17.587712 7320 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 12 12:21:17.588170 master-0 kubenswrapper[7320]: I0312 12:21:17.587733 7320 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 12 12:21:17.588170 master-0 kubenswrapper[7320]: I0312 12:21:17.587839 7320 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 12 12:21:17.593977 master-0 kubenswrapper[7320]: I0312 12:21:17.593909 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" volumeName="kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap" seLinuxMountContext="" Mar 12 12:21:17.593977 master-0 kubenswrapper[7320]: I0312 12:21:17.593965 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" volumeName="kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.593996 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfd178d7-f518-413b-95ab-ab6687be6e0f" 
volumeName="kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594005 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594016 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594024 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a22189f2-3f35-4ea6-9892-39a1b46637e2" volumeName="kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594033 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a22189f2-3f35-4ea6-9892-39a1b46637e2" volumeName="kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594041 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594070 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9194868-75ce-4138-a9d4-ddd64660c529" 
volumeName="kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594079 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594087 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a012d0b-d1a8-4cd3-8b91-b346d0445f24" volumeName="kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594096 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" volumeName="kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594105 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle" seLinuxMountContext="" Mar 12 12:21:17.594110 master-0 kubenswrapper[7320]: I0312 12:21:17.594115 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594143 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" 
volumeName="kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594155 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e6f7496-1047-482d-9203-ff83a9eb7d93" volumeName="kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594165 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594173 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" volumeName="kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594182 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" volumeName="kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594191 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c02552c-a477-4c6c-8a45-2fdc758c084b" volumeName="kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594200 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55bf535c-93ab-4870-a9d2-c02496d71ef0" 
volumeName="kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594228 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" volumeName="kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594236 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a121d0d-d201-446b-97a1-e2414e599f4a" volumeName="kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594244 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae2269d7-f11f-46d1-95e7-f89a70ee1152" volumeName="kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594252 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea80247e-b4dd-45dc-8255-6e68508c8480" volumeName="kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594271 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfd178d7-f518-413b-95ab-ab6687be6e0f" volumeName="kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594298 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" 
volumeName="kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594311 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54612733-158f-4a92-a1bf-f4a8d653ffaf" volumeName="kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594320 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfd178d7-f518-413b-95ab-ab6687be6e0f" volumeName="kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594328 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a121d0d-d201-446b-97a1-e2414e599f4a" volumeName="kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594337 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594345 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" volumeName="kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt" seLinuxMountContext="" Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594353 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" 
volumeName="kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config" seLinuxMountContext=""
Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594380 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" volumeName="kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594390 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61ab511b-72e9-4fb9-b5de-770f49514369" volumeName="kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls" seLinuxMountContext=""
Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594398 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666857a1-0ddf-4b48-91f4-44cce154d1b1" volumeName="kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config" seLinuxMountContext=""
Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594407 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666857a1-0ddf-4b48-91f4-44cce154d1b1" volumeName="kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy" seLinuxMountContext=""
Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594416 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" volumeName="kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides" seLinuxMountContext=""
Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594426 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" volumeName="kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access" seLinuxMountContext=""
Mar 12 12:21:17.594451 master-0 kubenswrapper[7320]: I0312 12:21:17.594456 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a154f648-b96d-449e-b0f5-ba32266000c2" volumeName="kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594472 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a22189f2-3f35-4ea6-9892-39a1b46637e2" volumeName="kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594509 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1d16bbc-778b-4fc1-abb2-b43e79a7c532" volumeName="kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594524 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e64bc838-280e-4231-9732-1adb69fed0bc" volumeName="kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594536 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d961a5f0-84b7-47d7-846b-238475947121" volumeName="kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594579 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54612733-158f-4a92-a1bf-f4a8d653ffaf" volumeName="kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594594 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74d06933-afab-43a3-a1d3-88a569178d34" volumeName="kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594607 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a154f648-b96d-449e-b0f5-ba32266000c2" volumeName="kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594619 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594630 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594665 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5890f0c-cebe-4788-89f7-27568d875741" volumeName="kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594676 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" volumeName="kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594685 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c02552c-a477-4c6c-8a45-2fdc758c084b" volumeName="kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594697 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61ab511b-72e9-4fb9-b5de-770f49514369" volumeName="kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594706 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" volumeName="kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594716 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5890f0c-cebe-4788-89f7-27568d875741" volumeName="kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594746 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9194868-75ce-4138-a9d4-ddd64660c529" volumeName="kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594756 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" volumeName="kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594767 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" volumeName="kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594775 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" volumeName="kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594783 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594792 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594819 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594829 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666857a1-0ddf-4b48-91f4-44cce154d1b1" volumeName="kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594840 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8e6f7496-1047-482d-9203-ff83a9eb7d93" volumeName="kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594850 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594859 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea80247e-b4dd-45dc-8255-6e68508c8480" volumeName="kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594868 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" volumeName="kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594893 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55bf535c-93ab-4870-a9d2-c02496d71ef0" volumeName="kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594919 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a121d0d-d201-446b-97a1-e2414e599f4a" volumeName="kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594931 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9bc7dea3-1868-488c-a34b-288cde3acd35" volumeName="kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594940 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594948 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5890f0c-cebe-4788-89f7-27568d875741" volumeName="kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594980 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.594991 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" volumeName="kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595001 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595009 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a154f648-b96d-449e-b0f5-ba32266000c2" volumeName="kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595018 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595027 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae2269d7-f11f-46d1-95e7-f89a70ee1152" volumeName="kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595053 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea80247e-b4dd-45dc-8255-6e68508c8480" volumeName="kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595065 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" volumeName="kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595075 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55bf535c-93ab-4870-a9d2-c02496d71ef0" volumeName="kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595083 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595092 7320 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3f295ac-7bc7-43b7-bd30-db82e7f16cd7" volumeName="kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz" seLinuxMountContext=""
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595100 7320 reconstruct.go:97] "Volume reconstruction finished"
Mar 12 12:21:17.604351 master-0 kubenswrapper[7320]: I0312 12:21:17.595106 7320 reconciler.go:26] "Reconciler: start to sync state"
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.587710 7320 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h31m24.398607375s for next certificate rotation
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.605653 7320 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.605684 7320 factory.go:55] Registering systemd factory
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.605714 7320 factory.go:221] Registration of the systemd container factory successfully
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.606012 7320 factory.go:153] Registering CRI-O factory
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.606041 7320 factory.go:221] Registration of the crio container factory successfully
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.606139 7320 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.606161 7320 factory.go:103] Registering Raw factory
Mar 12 12:21:17.606392 master-0 kubenswrapper[7320]: I0312 12:21:17.606207 7320 manager.go:1196] Started watching for new ooms in manager
Mar 12 12:21:17.607395 master-0 kubenswrapper[7320]: I0312 12:21:17.607359 7320 manager.go:319] Starting recovery of all containers
Mar 12 12:21:17.751226 master-0 kubenswrapper[7320]: I0312 12:21:17.751076 7320 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 12 12:21:17.752997 master-0 kubenswrapper[7320]: I0312 12:21:17.752961 7320 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 12 12:21:17.753090 master-0 kubenswrapper[7320]: I0312 12:21:17.753020 7320 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 12 12:21:17.753090 master-0 kubenswrapper[7320]: I0312 12:21:17.753057 7320 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 12 12:21:17.753168 master-0 kubenswrapper[7320]: E0312 12:21:17.753131 7320 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 12 12:21:17.755916 master-0 kubenswrapper[7320]: I0312 12:21:17.755882 7320 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 12 12:21:17.756077 master-0 kubenswrapper[7320]: I0312 12:21:17.756056 7320 manager.go:324] Recovery completed
Mar 12 12:21:17.761816 master-0 kubenswrapper[7320]: I0312 12:21:17.761725 7320 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03" exitCode=0
Mar 12 12:21:17.764720 master-0 kubenswrapper[7320]: I0312 12:21:17.764697 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 12 12:21:17.765154 master-0 kubenswrapper[7320]: I0312 12:21:17.765134 7320 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27" exitCode=1
Mar 12 12:21:17.765154 master-0 kubenswrapper[7320]: I0312 12:21:17.765151 7320 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="db3181f1b8f0872f0e0be3a238121d5924cf03894cfb4adac07406dd2be14404" exitCode=0
Mar 12 12:21:17.766762 master-0 kubenswrapper[7320]: I0312 12:21:17.766743 7320 generic.go:334] "Generic (PLEG): container finished" podID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerID="d6ff10b313edaeab618ba4bd948891faf24292c7fee20c8b60ece8104bb06b3a" exitCode=0
Mar 12 12:21:17.778162 master-0 kubenswrapper[7320]: I0312 12:21:17.778132 7320 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="609bc41bd5850647fabc2e01d12345f9b41d6cc4ea84bcb7679ae4b6d13d442e" exitCode=0
Mar 12 12:21:17.778162 master-0 kubenswrapper[7320]: I0312 12:21:17.778158 7320 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="fb3151f0498b4d271395613cdb5a66c2bdbd18c371f71b100988d9a1524ba2df" exitCode=0
Mar 12 12:21:17.778162 master-0 kubenswrapper[7320]: I0312 12:21:17.778170 7320 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="ea68604b446ee8cffe24318b7151377c7b04f157b8ea561b83368baecd127158" exitCode=0
Mar 12 12:21:17.778330 master-0 kubenswrapper[7320]: I0312 12:21:17.778180 7320 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="ef1ee7f5f63359043faa6c46c732e5167be16d3211712dbb5e513926f5b91304" exitCode=0
Mar 12 12:21:17.778330 master-0 kubenswrapper[7320]: I0312 12:21:17.778193 7320 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="9b70d60ef3db50f988d7297005c87ae9142093113f8ee25c0d2a4d1f3023050e" exitCode=0
Mar 12 12:21:17.778330 master-0 kubenswrapper[7320]: I0312 12:21:17.778203 7320 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="05ebbc8f6ffb604ea4cb658572a6553a226fea47b2132dba48a9b8a612eeb8a1" exitCode=0
Mar 12 12:21:17.786216 master-0 kubenswrapper[7320]: I0312 12:21:17.786164 7320 generic.go:334] "Generic (PLEG): container finished" podID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerID="823aef6fae9a1e0ac9ce3e87b09c4b094495f36691d261ce34a1a0d40c54755e" exitCode=0
Mar 12 12:21:17.798868 master-0 kubenswrapper[7320]: I0312 12:21:17.791613 7320 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 12 12:21:17.848316 master-0 kubenswrapper[7320]: I0312 12:21:17.848252 7320 generic.go:334] "Generic (PLEG): container finished" podID="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" containerID="0f612f745cd190b544d29d6db2a18fd47dfd753a5c71d5de2a3d0a726aefe224" exitCode=0
Mar 12 12:21:17.853495 master-0 kubenswrapper[7320]: E0312 12:21:17.853430 7320 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 12 12:21:17.855812 master-0 kubenswrapper[7320]: I0312 12:21:17.855777 7320 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 12 12:21:17.855812 master-0 kubenswrapper[7320]: I0312 12:21:17.855797 7320 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 12 12:21:17.855812 master-0 kubenswrapper[7320]: I0312 12:21:17.855816 7320 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 12:21:17.855976 master-0 kubenswrapper[7320]: I0312 12:21:17.855953 7320 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 12 12:21:17.856007 master-0 kubenswrapper[7320]: I0312 12:21:17.855968 7320 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 12 12:21:17.856007 master-0 kubenswrapper[7320]: I0312 12:21:17.855997 7320 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 12 12:21:17.856007 master-0 kubenswrapper[7320]: I0312 12:21:17.856006 7320 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 12 12:21:17.856115 master-0 kubenswrapper[7320]: I0312 12:21:17.856014 7320 policy_none.go:49] "None policy: Start"
Mar 12 12:21:17.859068 master-0 kubenswrapper[7320]: I0312 12:21:17.859041 7320 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 12 12:21:17.859068 master-0 kubenswrapper[7320]: I0312 12:21:17.859068 7320 state_mem.go:35] "Initializing new in-memory state store"
Mar 12 12:21:17.859330 master-0 kubenswrapper[7320]: I0312 12:21:17.859307 7320 state_mem.go:75] "Updated machine memory state"
Mar 12 12:21:17.859330 master-0 kubenswrapper[7320]: I0312 12:21:17.859328 7320 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 12 12:21:17.871216 master-0 kubenswrapper[7320]: I0312 12:21:17.871184 7320 manager.go:334] "Starting Device Plugin manager"
Mar 12 12:21:17.871416 master-0 kubenswrapper[7320]: I0312 12:21:17.871289 7320 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 12 12:21:17.871416 master-0 kubenswrapper[7320]: I0312 12:21:17.871300 7320 server.go:79] "Starting device plugin registration server"
Mar 12 12:21:17.871936 master-0 kubenswrapper[7320]: I0312 12:21:17.871920 7320 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 12:21:17.871984 master-0 kubenswrapper[7320]: I0312 12:21:17.871943 7320 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 12:21:17.872359 master-0 kubenswrapper[7320]: I0312 12:21:17.872341 7320 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 12 12:21:17.872572 master-0 kubenswrapper[7320]: I0312 12:21:17.872409 7320 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 12 12:21:17.872572 master-0 kubenswrapper[7320]: I0312 12:21:17.872415 7320 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 12:21:17.972417 master-0 kubenswrapper[7320]: I0312 12:21:17.972348 7320 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:21:17.975123 master-0 kubenswrapper[7320]: I0312 12:21:17.975107 7320 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:21:17.975187 master-0 kubenswrapper[7320]: I0312 12:21:17.975135 7320 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:21:17.975187 master-0 kubenswrapper[7320]: I0312 12:21:17.975167 7320 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:21:17.975236 master-0 kubenswrapper[7320]: I0312 12:21:17.975217 7320 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:21:17.982390 master-0 kubenswrapper[7320]: I0312 12:21:17.982070 7320 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 12 12:21:17.982390 master-0 kubenswrapper[7320]: I0312 12:21:17.982134 7320 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 12 12:21:18.054121 master-0 kubenswrapper[7320]: I0312 12:21:18.053992 7320 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0"]
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054339 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054404 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054416 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a732af73f8df5350fffca3678cce89972e48b40d0d2aaf3a11513ec460452d56"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054426 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054438 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"db3181f1b8f0872f0e0be3a238121d5924cf03894cfb4adac07406dd2be14404"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054448 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054463 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506"
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054523 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2"
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054533 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"96e48ce640071ce1e3c5822f8f356a843319ddfef771dec6cce93f02b2946dac"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054545 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"7832f5ffa2c5ed0b1534228e4894dd1d7b32cd0726e9bdedd6ffb73456947fa0"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054554 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054565 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054675 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054699 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054710 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6"}
Mar 12 12:21:18.054972 master-0 kubenswrapper[7320]: I0312 12:21:18.054745 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"40e0db45ceb59150b193a13f31ec145076b5d2cdaee765b94be9609189ebe6e3"}
Mar 12 12:21:18.067940 master-0 kubenswrapper[7320]: E0312 12:21:18.067890 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:21:18.071566 master-0 kubenswrapper[7320]: W0312 12:21:18.070705 7320 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 12 12:21:18.071566 master-0 kubenswrapper[7320]: E0312 12:21:18.070755 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:21:18.071566 master-0 kubenswrapper[7320]: E0312 12:21:18.071308 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:21:18.071566 master-0 kubenswrapper[7320]: E0312 12:21:18.071413 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:21:18.071566 master-0 kubenswrapper[7320]: E0312 12:21:18.071421 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:21:18.092881 master-0 kubenswrapper[7320]: I0312 12:21:18.092850 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:21:18.092982 master-0 kubenswrapper[7320]: I0312 12:21:18.092891 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:21:18.092982 master-0 kubenswrapper[7320]: I0312 12:21:18.092916 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:21:18.092982 master-0 kubenswrapper[7320]: I0312 12:21:18.092937 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:21:18.092982 master-0 kubenswrapper[7320]: I0312 12:21:18.092958 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:21:18.093137 master-0 kubenswrapper[7320]: I0312 12:21:18.092984 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:21:18.093137 master-0 kubenswrapper[7320]: I0312 12:21:18.093006 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 12 12:21:18.093137 master-0 kubenswrapper[7320]: I0312 12:21:18.093081 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\")
" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:21:18.093239 master-0 kubenswrapper[7320]: I0312 12:21:18.093203 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 12 12:21:18.093276 master-0 kubenswrapper[7320]: I0312 12:21:18.093251 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.093313 master-0 kubenswrapper[7320]: I0312 12:21:18.093279 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.093313 master-0 kubenswrapper[7320]: I0312 12:21:18.093297 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.093468 master-0 kubenswrapper[7320]: I0312 12:21:18.093319 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.093468 master-0 kubenswrapper[7320]: I0312 12:21:18.093346 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:21:18.093468 master-0 kubenswrapper[7320]: I0312 12:21:18.093373 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.093468 master-0 kubenswrapper[7320]: I0312 12:21:18.093402 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.093468 master-0 kubenswrapper[7320]: I0312 12:21:18.093427 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194391 master-0 kubenswrapper[7320]: I0312 12:21:18.194262 
7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194391 master-0 kubenswrapper[7320]: I0312 12:21:18.194324 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:21:18.194391 master-0 kubenswrapper[7320]: I0312 12:21:18.194345 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194515 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194526 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194868 master-0 
kubenswrapper[7320]: I0312 12:21:18.194550 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194589 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194602 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194609 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194648 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: 
I0312 12:21:18.194652 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194653 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194687 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194709 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194740 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 
master-0 kubenswrapper[7320]: I0312 12:21:18.194719 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194792 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194790 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194807 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194832 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:21:18.194868 
master-0 kubenswrapper[7320]: I0312 12:21:18.194832 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194862 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194864 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194885 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 12:21:18.194893 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 12 12:21:18.194868 master-0 kubenswrapper[7320]: I0312 
12:21:18.194944 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.194966 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.194993 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.194997 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.195013 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 
12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.194967 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 12 12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.195052 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.195060 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:18.196558 master-0 kubenswrapper[7320]: I0312 12:21:18.195097 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:18.572670 master-0 kubenswrapper[7320]: I0312 12:21:18.572604 7320 apiserver.go:52] "Watching apiserver" Mar 12 12:21:18.580503 master-0 kubenswrapper[7320]: I0312 12:21:18.580441 7320 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 12:21:18.581050 master-0 kubenswrapper[7320]: I0312 12:21:18.581008 7320 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["kube-system/bootstrap-kube-controller-manager-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx","openshift-network-operator/network-operator-7c649bf6d4-rbb5m","openshift-ovn-kubernetes/ovnkube-node-l5d2w","openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf","openshift-multus/multus-additional-cni-plugins-r86hc","openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv","kube-system/bootstrap-kube-scheduler-master-0","openshift-marketplace/marketplace-operator-64bf9778cb-rgstx","openshift-multus/multus-hb48g","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9","openshift-network-operator/iptables-alerter-xqmw9","openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4","openshift-ingress-operator/ingress-operator-677db989d6-vpss8","openshift-multus/network-metrics-daemon-4m9jh","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h","openshift-network-diagnostics/network-check-target-dfz7x","assisted-installer/assisted-installer-controller-p2nlp","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv","opens
hift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h","openshift-dns-operator/dns-operator-589895fbb7-l8x6p","openshift-network-node-identity/network-node-identity-rzmhl","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp","openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b","openshift-multus/multus-admission-controller-8d675b596-xpzn2","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"] Mar 12 12:21:18.581222 master-0 kubenswrapper[7320]: I0312 12:21:18.581194 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-p2nlp" Mar 12 12:21:18.581307 master-0 kubenswrapper[7320]: I0312 12:21:18.581279 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:18.581371 master-0 kubenswrapper[7320]: I0312 12:21:18.581336 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:18.581472 master-0 kubenswrapper[7320]: I0312 12:21:18.581418 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:18.581947 master-0 kubenswrapper[7320]: I0312 12:21:18.581914 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:18.583233 master-0 kubenswrapper[7320]: I0312 12:21:18.583201 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:18.583294 master-0 kubenswrapper[7320]: I0312 12:21:18.583269 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:18.583412 master-0 kubenswrapper[7320]: I0312 12:21:18.583382 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:18.583584 master-0 kubenswrapper[7320]: I0312 12:21:18.583553 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.583896 master-0 kubenswrapper[7320]: I0312 12:21:18.583866 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:18.583965 master-0 kubenswrapper[7320]: I0312 12:21:18.583935 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:18.595579 master-0 kubenswrapper[7320]: I0312 12:21:18.594406 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:21:18.595579 master-0 kubenswrapper[7320]: I0312 12:21:18.594563 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:18.595795 master-0 kubenswrapper[7320]: I0312 12:21:18.595688 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:21:18.597468 master-0 kubenswrapper[7320]: I0312 12:21:18.597411 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6flvz\" (UniqueName: \"kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:18.597468 master-0 kubenswrapper[7320]: I0312 12:21:18.597470 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.597593 master-0 kubenswrapper[7320]: I0312 12:21:18.597517 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:18.597593 master-0 kubenswrapper[7320]: I0312 12:21:18.597549 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.597593 master-0 kubenswrapper[7320]: I0312 12:21:18.597585 7320 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ft5sd\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:18.597706 master-0 kubenswrapper[7320]: I0312 12:21:18.597612 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdvf6\" (UniqueName: \"kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:18.597706 master-0 kubenswrapper[7320]: I0312 12:21:18.597638 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:21:18.597706 master-0 kubenswrapper[7320]: I0312 12:21:18.597665 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:21:18.597706 master-0 kubenswrapper[7320]: I0312 12:21:18.597689 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbztv\" (UniqueName: \"kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv\") pod 
\"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:18.597839 master-0 kubenswrapper[7320]: I0312 12:21:18.597715 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:18.597839 master-0 kubenswrapper[7320]: I0312 12:21:18.597739 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:18.597839 master-0 kubenswrapper[7320]: I0312 12:21:18.597764 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:18.597839 master-0 kubenswrapper[7320]: I0312 12:21:18.597787 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:18.597839 master-0 kubenswrapper[7320]: I0312 12:21:18.597836 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:18.598012 master-0 kubenswrapper[7320]: I0312 12:21:18.597864 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9gmt\" (UniqueName: \"kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:18.598012 master-0 kubenswrapper[7320]: I0312 12:21:18.597896 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r7v\" (UniqueName: \"kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:18.598012 master-0 kubenswrapper[7320]: I0312 12:21:18.597920 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:18.598012 master-0 kubenswrapper[7320]: I0312 12:21:18.597964 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.598012 master-0 kubenswrapper[7320]: I0312 12:21:18.597994 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.598176 master-0 kubenswrapper[7320]: I0312 12:21:18.598023 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:18.598176 master-0 kubenswrapper[7320]: I0312 12:21:18.598047 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.598176 master-0 kubenswrapper[7320]: I0312 12:21:18.598083 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.598176 master-0 kubenswrapper[7320]: I0312 12:21:18.598116 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwfl\" (UniqueName: \"kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.598176 master-0 kubenswrapper[7320]: I0312 12:21:18.598147 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:18.598176 master-0 kubenswrapper[7320]: I0312 12:21:18.598172 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:18.598373 master-0 kubenswrapper[7320]: I0312 12:21:18.598220 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:18.598373 master-0 kubenswrapper[7320]: I0312 12:21:18.598248 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:18.598373 master-0 kubenswrapper[7320]: I0312 12:21:18.598316 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.598373 master-0 kubenswrapper[7320]: I0312 12:21:18.598350 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:18.598373 master-0 kubenswrapper[7320]: I0312 12:21:18.598375 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqcrz\" (UniqueName: \"kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:18.598582 master-0 kubenswrapper[7320]: I0312 12:21:18.598399 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:18.598582 master-0 kubenswrapper[7320]: I0312 12:21:18.598419 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svpvs\" (UniqueName: \"kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:18.598582 master-0 kubenswrapper[7320]: I0312 12:21:18.598574 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:18.598691 master-0 kubenswrapper[7320]: I0312 12:21:18.598611 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:18.598691 master-0 kubenswrapper[7320]: I0312 12:21:18.598640 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:18.598691 master-0 kubenswrapper[7320]: I0312 12:21:18.598668 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:18.600188 master-0 kubenswrapper[7320]: I0312 12:21:18.600145 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 12 12:21:18.600401 master-0 kubenswrapper[7320]: I0312 12:21:18.600370 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 12 12:21:18.600496 master-0 kubenswrapper[7320]: I0312 12:21:18.600445 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 12 12:21:18.600560 master-0 kubenswrapper[7320]: I0312 12:21:18.600540 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 12 12:21:18.600678 master-0 kubenswrapper[7320]: I0312 12:21:18.600648 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 12 12:21:18.600751 master-0 kubenswrapper[7320]: I0312 12:21:18.600721 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.600807 master-0 kubenswrapper[7320]: I0312 12:21:18.600763 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 12 12:21:18.600807 master-0 kubenswrapper[7320]: I0312 12:21:18.600770 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 12 12:21:18.600881 master-0 kubenswrapper[7320]: I0312 12:21:18.600872 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 12 12:21:18.601033 master-0 kubenswrapper[7320]: I0312 12:21:18.600935 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 12 12:21:18.601033 master-0 kubenswrapper[7320]: I0312 12:21:18.600974 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 12 12:21:18.601179 master-0 kubenswrapper[7320]: I0312 12:21:18.601155 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 12 12:21:18.601502 master-0 kubenswrapper[7320]: I0312 12:21:18.601330 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.601892 master-0 kubenswrapper[7320]: I0312 12:21:18.601853 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.601947 master-0 kubenswrapper[7320]: I0312 12:21:18.601895 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 12 12:21:18.602088 master-0 kubenswrapper[7320]: I0312 12:21:18.602023 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 12 12:21:18.602171 master-0 kubenswrapper[7320]: I0312 12:21:18.602131 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 12 12:21:18.602254 master-0 kubenswrapper[7320]: I0312 12:21:18.602230 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 12 12:21:18.602383 master-0 kubenswrapper[7320]: I0312 12:21:18.602353 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.602510 master-0 kubenswrapper[7320]: I0312 12:21:18.602490 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 12 12:21:18.602557 master-0 kubenswrapper[7320]: I0312 12:21:18.602543 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 12 12:21:18.602628 master-0 kubenswrapper[7320]: I0312 12:21:18.602608 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.602744 master-0 kubenswrapper[7320]: I0312 12:21:18.602726 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.602813 master-0 kubenswrapper[7320]: I0312 12:21:18.602784 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.602856 master-0 kubenswrapper[7320]: I0312 12:21:18.602844 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 12 12:21:18.602990 master-0 kubenswrapper[7320]: I0312 12:21:18.602969 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 12 12:21:18.603090 master-0 kubenswrapper[7320]: I0312 12:21:18.603067 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.603142 master-0 kubenswrapper[7320]: I0312 12:21:18.603117 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 12 12:21:18.603237 master-0 kubenswrapper[7320]: I0312 12:21:18.603219 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.603385 master-0 kubenswrapper[7320]: I0312 12:21:18.603077 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 12 12:21:18.603435 master-0 kubenswrapper[7320]: I0312 12:21:18.603410 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.603555 master-0 kubenswrapper[7320]: I0312 12:21:18.603322 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 12 12:21:18.603610 master-0 kubenswrapper[7320]: I0312 12:21:18.603556 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.603680 master-0 kubenswrapper[7320]: I0312 12:21:18.603661 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.603818 master-0 kubenswrapper[7320]: I0312 12:21:18.603787 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 12 12:21:18.604017 master-0 kubenswrapper[7320]: I0312 12:21:18.603987 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.607617 master-0 kubenswrapper[7320]: I0312 12:21:18.606421 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 12 12:21:18.607927 master-0 kubenswrapper[7320]: I0312 12:21:18.607746 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 12 12:21:18.607927 master-0 kubenswrapper[7320]: I0312 12:21:18.607776 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 12 12:21:18.607927 master-0 kubenswrapper[7320]: I0312 12:21:18.607827 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 12 12:21:18.607927 master-0 kubenswrapper[7320]: I0312 12:21:18.607829 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 12 12:21:18.608342 master-0 kubenswrapper[7320]: I0312 12:21:18.608301 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.608397 master-0 kubenswrapper[7320]: I0312 12:21:18.608344 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 12 12:21:18.608498 master-0 kubenswrapper[7320]: I0312 12:21:18.608469 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 12 12:21:18.608541 master-0 kubenswrapper[7320]: I0312 12:21:18.608465 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.608602 master-0 kubenswrapper[7320]: I0312 12:21:18.608590 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.608642 master-0 kubenswrapper[7320]: I0312 12:21:18.608629 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 12 12:21:18.608751 master-0 kubenswrapper[7320]: I0312 12:21:18.608499 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 12 12:21:18.608812 master-0 kubenswrapper[7320]: I0312 12:21:18.608693 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 12 12:21:18.609784 master-0 kubenswrapper[7320]: I0312 12:21:18.609269 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:18.609784 master-0 kubenswrapper[7320]: I0312 12:21:18.609285 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:18.609784 master-0 kubenswrapper[7320]: I0312 12:21:18.609389 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:18.609784 master-0 kubenswrapper[7320]: I0312 12:21:18.609577 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:18.609784 master-0 kubenswrapper[7320]: I0312 12:21:18.609579 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:18.609784 master-0 kubenswrapper[7320]: I0312 12:21:18.609695 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:18.609784 master-0 kubenswrapper[7320]: I0312 12:21:18.609727 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.611068 master-0 kubenswrapper[7320]: I0312 12:21:18.611036 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 12 12:21:18.611380 master-0 kubenswrapper[7320]: I0312 12:21:18.611356 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 12 12:21:18.611433 master-0 kubenswrapper[7320]: I0312 12:21:18.611403 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.611472 master-0 kubenswrapper[7320]: I0312 12:21:18.611447 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 12 12:21:18.611730 master-0 kubenswrapper[7320]: I0312 12:21:18.611709 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.611852 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.611904 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.611990 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.612068 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.612096 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.612185 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.612187 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.612288 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 12 12:21:18.612378 master-0 kubenswrapper[7320]: I0312 12:21:18.612350 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 12 12:21:18.612743 master-0 kubenswrapper[7320]: I0312 12:21:18.612521 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 12 12:21:18.612743 master-0 kubenswrapper[7320]: I0312 12:21:18.612531 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 12 12:21:18.614275 master-0 kubenswrapper[7320]: I0312 12:21:18.613041 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 12 12:21:18.614275 master-0 kubenswrapper[7320]: I0312 12:21:18.613186 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 12 12:21:18.625576 master-0 kubenswrapper[7320]: I0312 12:21:18.620147 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.625576 master-0 kubenswrapper[7320]: I0312 12:21:18.620375 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.639572 master-0 kubenswrapper[7320]: I0312 12:21:18.639510 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.639768 master-0 kubenswrapper[7320]: I0312 12:21:18.639704 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 12 12:21:18.639818 master-0 kubenswrapper[7320]: I0312 12:21:18.639806 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 12 12:21:18.640080 master-0 kubenswrapper[7320]: I0312 12:21:18.640048 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 12 12:21:18.640176 master-0 kubenswrapper[7320]: I0312 12:21:18.640149 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 12 12:21:18.640275 master-0 kubenswrapper[7320]: I0312 12:21:18.640248 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 12 12:21:18.640346 master-0 kubenswrapper[7320]: I0312 12:21:18.640329 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 12 12:21:18.640498 master-0 kubenswrapper[7320]: I0312 12:21:18.640448 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 12 12:21:18.641454 master-0 kubenswrapper[7320]: I0312 12:21:18.641396 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 12 12:21:18.642094 master-0 kubenswrapper[7320]: I0312 12:21:18.641959 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 12 12:21:18.642419 master-0 kubenswrapper[7320]: I0312 12:21:18.642388 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 12 12:21:18.642677 master-0 kubenswrapper[7320]: I0312 12:21:18.642653 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 12 12:21:18.666512 master-0 kubenswrapper[7320]: I0312 12:21:18.662967 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 12 12:21:18.669800 master-0 kubenswrapper[7320]: I0312 12:21:18.669696 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 12 12:21:18.670036 master-0 kubenswrapper[7320]: I0312 12:21:18.670013 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 12 12:21:18.670423 master-0 kubenswrapper[7320]: I0312 12:21:18.670395 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 12 12:21:18.673254 master-0 kubenswrapper[7320]: I0312 12:21:18.671825 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 12 12:21:18.673567 master-0 kubenswrapper[7320]: I0312 12:21:18.673534 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:18.674846 master-0 kubenswrapper[7320]: I0312 12:21:18.674815 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.675026 master-0 kubenswrapper[7320]: I0312 12:21:18.675006 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 12 12:21:18.675099 master-0 kubenswrapper[7320]: I0312 12:21:18.675067 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:21:18.675631 master-0 kubenswrapper[7320]: I0312 12:21:18.675082 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:18.675696 master-0 kubenswrapper[7320]: I0312 12:21:18.675598 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:18.675798 master-0 kubenswrapper[7320]: I0312 12:21:18.675769 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.675834 master-0 kubenswrapper[7320]: I0312 12:21:18.675758 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 12 12:21:18.675969 master-0 kubenswrapper[7320]: I0312 12:21:18.675947 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 12 12:21:18.676004 master-0 kubenswrapper[7320]: I0312 12:21:18.675946 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svpvs\" (UniqueName: \"kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:21:18.676036 master-0 kubenswrapper[7320]: I0312 12:21:18.675999 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6flvz\" (UniqueName: \"kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:18.676064 master-0 kubenswrapper[7320]: I0312 12:21:18.676022 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:18.676102 master-0 kubenswrapper[7320]: I0312 12:21:18.676080 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwfl\" (UniqueName: \"kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:21:18.676353 master-0 kubenswrapper[7320]: I0312 12:21:18.676332 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbztv\" (UniqueName: \"kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:21:18.676516 master-0 kubenswrapper[7320]: I0312 12:21:18.676487 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdvf6\" (UniqueName: \"kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:18.677772 master-0 kubenswrapper[7320]: I0312 12:21:18.677299 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9gmt\" (UniqueName: \"kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:21:18.677983 master-0 kubenswrapper[7320]: I0312 12:21:18.677930 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5sd\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:18.681570 master-0 kubenswrapper[7320]: I0312 12:21:18.681316 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 12 12:21:18.684928 master-0 kubenswrapper[7320]: I0312 12:21:18.684887 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 12 12:21:18.686310 master-0 kubenswrapper[7320]: I0312 12:21:18.686153 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 12 12:21:18.689001 master-0 kubenswrapper[7320]: I0312 12:21:18.688973 7320 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 12 12:21:18.690040 master-0 kubenswrapper[7320]: I0312 12:21:18.689932 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:18.694668 master-0 kubenswrapper[7320]: I0312 12:21:18.694644 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.699627 master-0 kubenswrapper[7320]: I0312 12:21:18.699592 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.699704 master-0 kubenswrapper[7320]: I0312 12:21:18.699639 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmlzw\" (UniqueName: \"kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") "
pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:18.699704 master-0 kubenswrapper[7320]: I0312 12:21:18.699662 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:18.699704 master-0 kubenswrapper[7320]: I0312 12:21:18.699679 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.699704 master-0 kubenswrapper[7320]: I0312 12:21:18.699698 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.699872 master-0 kubenswrapper[7320]: I0312 12:21:18.699810 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.700082 master-0 kubenswrapper[7320]: I0312 12:21:18.700046 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwfct\" (UniqueName: 
\"kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.700082 master-0 kubenswrapper[7320]: I0312 12:21:18.700082 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.700245 master-0 kubenswrapper[7320]: I0312 12:21:18.700099 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p97xk\" (UniqueName: \"kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:18.700245 master-0 kubenswrapper[7320]: I0312 12:21:18.700110 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:18.700245 master-0 kubenswrapper[7320]: I0312 12:21:18.700119 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.700245 master-0 kubenswrapper[7320]: I0312 12:21:18.700192 7320 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg27g\" (UniqueName: \"kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:21:18.700245 master-0 kubenswrapper[7320]: I0312 12:21:18.700231 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx48\" (UniqueName: \"kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48\") pod \"csi-snapshot-controller-operator-5685fbc7d-vmj4h\" (UID: \"5a012d0b-d1a8-4cd3-8b91-b346d0445f24\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" Mar 12 12:21:18.700438 master-0 kubenswrapper[7320]: I0312 12:21:18.700274 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.700438 master-0 kubenswrapper[7320]: I0312 12:21:18.700310 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.700438 master-0 kubenswrapper[7320]: I0312 12:21:18.700344 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") 
" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:21:18.700438 master-0 kubenswrapper[7320]: I0312 12:21:18.700389 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:21:18.700438 master-0 kubenswrapper[7320]: E0312 12:21:18.700415 7320 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 12 12:21:18.700640 master-0 kubenswrapper[7320]: I0312 12:21:18.700421 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.700640 master-0 kubenswrapper[7320]: E0312 12:21:18.700508 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.200457099 +0000 UTC m=+1.759500980 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : secret "metrics-daemon-secret" not found Mar 12 12:21:18.700640 master-0 kubenswrapper[7320]: I0312 12:21:18.700577 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9fk\" (UniqueName: \"kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:21:18.700640 master-0 kubenswrapper[7320]: I0312 12:21:18.700615 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:18.700796 master-0 kubenswrapper[7320]: I0312 12:21:18.700669 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.700796 master-0 kubenswrapper[7320]: I0312 12:21:18.700708 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:18.700796 master-0 kubenswrapper[7320]: I0312 12:21:18.700741 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:21:18.700796 master-0 kubenswrapper[7320]: I0312 12:21:18.700763 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:21:18.700796 master-0 kubenswrapper[7320]: I0312 12:21:18.700786 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.700981 master-0 kubenswrapper[7320]: I0312 12:21:18.700812 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:18.700981 master-0 kubenswrapper[7320]: I0312 12:21:18.700841 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:18.700981 master-0 kubenswrapper[7320]: I0312 12:21:18.700896 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.700981 master-0 kubenswrapper[7320]: I0312 12:21:18.700963 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqx7\" (UniqueName: \"kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.701149 master-0 kubenswrapper[7320]: I0312 12:21:18.700988 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:18.701149 master-0 kubenswrapper[7320]: I0312 12:21:18.701014 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:18.701149 master-0 kubenswrapper[7320]: I0312 12:21:18.701041 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:18.701149 master-0 kubenswrapper[7320]: I0312 12:21:18.701044 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq9d5\" (UniqueName: \"kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:18.701149 master-0 kubenswrapper[7320]: I0312 12:21:18.701074 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.701149 master-0 kubenswrapper[7320]: I0312 12:21:18.701097 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:18.701149 master-0 kubenswrapper[7320]: I0312 12:21:18.701116 7320 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6l57v\" (UniqueName: \"kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:18.701394 master-0 kubenswrapper[7320]: I0312 12:21:18.701158 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.701394 master-0 kubenswrapper[7320]: I0312 12:21:18.701191 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:18.701394 master-0 kubenswrapper[7320]: I0312 12:21:18.701210 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:18.701394 master-0 kubenswrapper[7320]: I0312 12:21:18.701242 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.701394 
master-0 kubenswrapper[7320]: I0312 12:21:18.701260 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:18.701394 master-0 kubenswrapper[7320]: I0312 12:21:18.701286 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:18.701630 master-0 kubenswrapper[7320]: I0312 12:21:18.701455 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:18.701630 master-0 kubenswrapper[7320]: E0312 12:21:18.701548 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:18.701630 master-0 kubenswrapper[7320]: E0312 12:21:18.701593 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.201575006 +0000 UTC m=+1.760618887 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:18.701743 master-0 kubenswrapper[7320]: I0312 12:21:18.701644 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.701743 master-0 kubenswrapper[7320]: I0312 12:21:18.701703 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:18.701821 master-0 kubenswrapper[7320]: I0312 12:21:18.701778 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8mvz\" (UniqueName: \"kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:18.701821 master-0 kubenswrapper[7320]: I0312 12:21:18.701812 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.702292 master-0 kubenswrapper[7320]: I0312 12:21:18.701886 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.702292 master-0 kubenswrapper[7320]: I0312 12:21:18.701782 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:18.702292 master-0 kubenswrapper[7320]: I0312 12:21:18.701971 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:18.702292 master-0 kubenswrapper[7320]: I0312 12:21:18.702020 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:18.702292 master-0 kubenswrapper[7320]: I0312 12:21:18.702067 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:18.702470 master-0 kubenswrapper[7320]: I0312 12:21:18.702300 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.702470 master-0 kubenswrapper[7320]: I0312 12:21:18.702385 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.702470 master-0 kubenswrapper[7320]: E0312 12:21:18.702399 7320 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:18.702470 master-0 kubenswrapper[7320]: E0312 12:21:18.702450 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.202421176 +0000 UTC m=+1.761465057 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found Mar 12 12:21:18.702470 master-0 kubenswrapper[7320]: I0312 12:21:18.702417 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:18.702779 master-0 kubenswrapper[7320]: I0312 12:21:18.702537 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:18.702779 master-0 kubenswrapper[7320]: I0312 12:21:18.702564 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.702779 master-0 kubenswrapper[7320]: I0312 12:21:18.702582 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:18.702779 master-0 kubenswrapper[7320]: I0312 12:21:18.702598 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.702779 master-0 kubenswrapper[7320]: I0312 12:21:18.702617 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:18.702779 master-0 kubenswrapper[7320]: I0312 12:21:18.702632 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703078 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5c7\" (UniqueName: \"kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703089 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703101 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703144 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703161 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703277 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " 
pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703413 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703425 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703441 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703501 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703522 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703537 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.703564 master-0 kubenswrapper[7320]: I0312 12:21:18.703552 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703637 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703660 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703680 7320 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703700 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dhfq\" (UniqueName: \"kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703792 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703823 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703850 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703867 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lnbq\" (UniqueName: \"kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703899 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703905 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703920 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 
12:21:18.703911 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.703978 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.704031 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.704119 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: I0312 12:21:18.704180 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: E0312 12:21:18.704201 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: E0312 12:21:18.704208 7320 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:18.704240 master-0 kubenswrapper[7320]: E0312 12:21:18.704266 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.20423519 +0000 UTC m=+1.763279061 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: E0312 12:21:18.704285 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.204279291 +0000 UTC m=+1.763323172 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704374 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704403 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704433 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704515 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 
12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704551 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704586 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704623 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704678 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2c4\" (UniqueName: \"kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704710 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704724 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704743 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704778 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704812 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704842 7320 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704873 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704590 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704908 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.704944 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.705261 master-0 
kubenswrapper[7320]: I0312 12:21:18.704978 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.705035 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd645\" (UniqueName: \"kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.705261 master-0 kubenswrapper[7320]: I0312 12:21:18.705037 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705363 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705536 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config\") pod \"multus-hb48g\" (UID: 
\"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705490 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705712 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705707 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705762 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705787 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy\") pod 
\"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705795 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705803 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705845 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705845 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705867 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcltq\" (UniqueName: 
\"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.705988 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.706006 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.706022 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.706048 master-0 kubenswrapper[7320]: I0312 12:21:18.706039 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706109 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706135 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706171 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706223 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706237 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706302 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706307 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706338 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706338 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706367 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706441 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706468 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706469 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706496 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: 
I0312 12:21:18.706515 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2f6r\" (UniqueName: \"kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706535 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706552 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xlx\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706567 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706681 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:18.706978 master-0 kubenswrapper[7320]: I0312 12:21:18.706703 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:21:18.714676 master-0 kubenswrapper[7320]: I0312 12:21:18.714602 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 12:21:18.734278 master-0 kubenswrapper[7320]: I0312 12:21:18.734231 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 12 12:21:18.743726 master-0 kubenswrapper[7320]: E0312 12:21:18.743674 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 12 12:21:18.743933 master-0 kubenswrapper[7320]: E0312 12:21:18.743848 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.243829139 +0000 UTC m=+1.802873020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found Mar 12 12:21:18.754430 master-0 kubenswrapper[7320]: I0312 12:21:18.754365 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 12:21:18.774099 master-0 kubenswrapper[7320]: I0312 12:21:18.774053 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 12:21:18.779142 master-0 kubenswrapper[7320]: I0312 12:21:18.779110 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:18.794534 master-0 kubenswrapper[7320]: I0312 12:21:18.794489 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 12:21:18.800104 master-0 kubenswrapper[7320]: I0312 12:21:18.800076 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:21:18.807689 master-0 kubenswrapper[7320]: I0312 12:21:18.807664 7320 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.807737 master-0 kubenswrapper[7320]: I0312 12:21:18.807700 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808054 master-0 kubenswrapper[7320]: I0312 12:21:18.808030 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808389 master-0 kubenswrapper[7320]: I0312 12:21:18.807752 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.808439 master-0 kubenswrapper[7320]: I0312 12:21:18.808414 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808505 master-0 kubenswrapper[7320]: I0312 12:21:18.808469 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808546 master-0 kubenswrapper[7320]: I0312 12:21:18.808518 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.808579 master-0 kubenswrapper[7320]: I0312 12:21:18.808555 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:18.808626 master-0 kubenswrapper[7320]: I0312 12:21:18.808576 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808626 master-0 kubenswrapper[7320]: I0312 12:21:18.808598 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:18.808685 master-0 kubenswrapper[7320]: I0312 12:21:18.808626 7320 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.808712 master-0 kubenswrapper[7320]: I0312 12:21:18.807999 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808816 master-0 kubenswrapper[7320]: I0312 12:21:18.808792 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808852 master-0 kubenswrapper[7320]: I0312 12:21:18.808832 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.808890 master-0 kubenswrapper[7320]: I0312 12:21:18.808855 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.808922 master-0 kubenswrapper[7320]: I0312 12:21:18.808891 7320 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.808922 master-0 kubenswrapper[7320]: I0312 12:21:18.808914 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.808974 master-0 kubenswrapper[7320]: I0312 12:21:18.808942 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.809092 master-0 kubenswrapper[7320]: I0312 12:21:18.809067 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.809124 master-0 kubenswrapper[7320]: I0312 12:21:18.809110 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 
12:21:18.809154 master-0 kubenswrapper[7320]: I0312 12:21:18.809140 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.809271 master-0 kubenswrapper[7320]: I0312 12:21:18.809234 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.809332 master-0 kubenswrapper[7320]: I0312 12:21:18.809276 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.809373 master-0 kubenswrapper[7320]: E0312 12:21:18.809355 7320 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:18.809645 master-0 kubenswrapper[7320]: I0312 12:21:18.809592 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:21:18.809744 master-0 kubenswrapper[7320]: I0312 12:21:18.809721 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.809775 master-0 kubenswrapper[7320]: I0312 12:21:18.809601 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.809775 master-0 kubenswrapper[7320]: E0312 12:21:18.809623 7320 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 12 12:21:18.809775 master-0 kubenswrapper[7320]: I0312 12:21:18.809730 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.809775 master-0 kubenswrapper[7320]: I0312 12:21:18.809764 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.809894 master-0 kubenswrapper[7320]: I0312 12:21:18.809751 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " 
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.809894 master-0 kubenswrapper[7320]: I0312 12:21:18.809786 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.809894 master-0 kubenswrapper[7320]: E0312 12:21:18.809745 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.30972973 +0000 UTC m=+1.868773611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:18.809894 master-0 kubenswrapper[7320]: I0312 12:21:18.809820 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.809894 master-0 kubenswrapper[7320]: I0312 12:21:18.809848 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 
12:21:18.809894 master-0 kubenswrapper[7320]: I0312 12:21:18.809852 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: I0312 12:21:18.809903 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: I0312 12:21:18.809884 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: I0312 12:21:18.809906 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: I0312 12:21:18.809866 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: E0312 12:21:18.809944 7320 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.309924995 +0000 UTC m=+1.868968876 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: I0312 12:21:18.810003 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: I0312 12:21:18.810024 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: I0312 12:21:18.810041 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.810060 master-0 kubenswrapper[7320]: E0312 12:21:18.810058 7320 secret.go:189] Couldn't get secret 
openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810066 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: E0312 12:21:18.810084 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.310075428 +0000 UTC m=+1.869119309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810099 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810127 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: 
\"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810131 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810144 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: E0312 12:21:18.810176 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: E0312 12:21:18.810196 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.310189591 +0000 UTC m=+1.869233472 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810155 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810199 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810224 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810227 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810243 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810291 master-0 kubenswrapper[7320]: I0312 12:21:18.810259 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810320 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810352 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810361 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810420 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810447 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810494 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810501 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810521 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810548 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810596 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810622 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810672 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:18.810696 master-0 kubenswrapper[7320]: I0312 12:21:18.810696 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.810735 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.810758 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.810799 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.810821 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.810913 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.810944 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.810983 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: E0312 12:21:18.811047 7320 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: E0312 12:21:18.811092 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.311075282 +0000 UTC m=+1.870119163 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.811123 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: E0312 12:21:18.811171 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: E0312 12:21:18.811199 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.311190675 +0000 UTC m=+1.870234556 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.811224 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:18.811456 master-0 kubenswrapper[7320]: I0312 12:21:18.811253 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:18.814799 master-0 kubenswrapper[7320]: I0312 12:21:18.814775 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.815461 master-0 kubenswrapper[7320]: I0312 12:21:18.815426 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r7v\" (UniqueName: \"kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:18.835216 master-0 kubenswrapper[7320]: I0312 12:21:18.835091 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 12 12:21:18.836610 master-0 kubenswrapper[7320]: I0312 12:21:18.836564 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqcrz\" (UniqueName: \"kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:21:18.855379 master-0 kubenswrapper[7320]: I0312 12:21:18.855336 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 12 12:21:18.861003 master-0 kubenswrapper[7320]: E0312 12:21:18.860979 7320 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:18.861695 master-0 kubenswrapper[7320]: E0312 12:21:18.861045 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:19.36102746 +0000 UTC m=+1.920071341 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:18.874790 master-0 kubenswrapper[7320]: I0312 12:21:18.874542 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 12 12:21:18.885213 master-0 kubenswrapper[7320]: I0312 12:21:18.884711 7320 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 12:21:18.894301 master-0 kubenswrapper[7320]: I0312 12:21:18.894272 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 12 12:21:18.914309 master-0 kubenswrapper[7320]: I0312 12:21:18.914266 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 12 12:21:18.967532 master-0 kubenswrapper[7320]: I0312 12:21:18.967400 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmlzw\" (UniqueName: \"kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:18.989905 master-0 kubenswrapper[7320]: I0312 12:21:18.989846 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97xk\" (UniqueName: \"kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:19.004966 master-0 kubenswrapper[7320]: I0312 12:21:19.004912 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwfct\" (UniqueName: \"kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:21:19.029328 master-0 kubenswrapper[7320]: I0312 12:21:19.029262 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg27g\" (UniqueName: \"kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl"
Mar 12 12:21:19.051518 master-0 kubenswrapper[7320]: I0312 12:21:19.051443 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bx48\" (UniqueName: \"kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48\") pod \"csi-snapshot-controller-operator-5685fbc7d-vmj4h\" (UID: \"5a012d0b-d1a8-4cd3-8b91-b346d0445f24\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"
Mar 12 12:21:19.070640 master-0 kubenswrapper[7320]: I0312 12:21:19.070603 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:21:19.087564 master-0 kubenswrapper[7320]: I0312 12:21:19.087425 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9fk\" (UniqueName: \"kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m"
Mar 12 12:21:19.107070 master-0 kubenswrapper[7320]: I0312 12:21:19.107007 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:21:19.148531 master-0 kubenswrapper[7320]: I0312 12:21:19.148453 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqx7\" (UniqueName: \"kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:21:19.149406 master-0 kubenswrapper[7320]: I0312 12:21:19.149376 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l57v\" (UniqueName: \"kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:19.168621 master-0 kubenswrapper[7320]: I0312 12:21:19.168578 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq9d5\" (UniqueName: \"kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:19.190658 master-0 kubenswrapper[7320]: I0312 12:21:19.190627 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:19.211190 master-0 kubenswrapper[7320]: I0312 12:21:19.211120 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8mvz\" (UniqueName: \"kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:21:19.217125 master-0 kubenswrapper[7320]: I0312 12:21:19.217052 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:19.217265 master-0 kubenswrapper[7320]: E0312 12:21:19.217236 7320 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 12 12:21:19.217348 master-0 kubenswrapper[7320]: E0312 12:21:19.217321 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.217299935 +0000 UTC m=+2.776343806 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : secret "metrics-daemon-secret" not found
Mar 12 12:21:19.218025 master-0 kubenswrapper[7320]: I0312 12:21:19.217986 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:19.218353 master-0 kubenswrapper[7320]: I0312 12:21:19.218310 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:19.219891 master-0 kubenswrapper[7320]: E0312 12:21:19.218223 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:19.220134 master-0 kubenswrapper[7320]: E0312 12:21:19.220111 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.220082862 +0000 UTC m=+2.779126783 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:19.220271 master-0 kubenswrapper[7320]: E0312 12:21:19.218416 7320 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:19.220400 master-0 kubenswrapper[7320]: E0312 12:21:19.220002 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 12 12:21:19.220577 master-0 kubenswrapper[7320]: E0312 12:21:19.220535 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.220464361 +0000 UTC m=+2.779508292 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found
Mar 12 12:21:19.220739 master-0 kubenswrapper[7320]: I0312 12:21:19.220707 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:19.220998 master-0 kubenswrapper[7320]: E0312 12:21:19.220734 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.220711537 +0000 UTC m=+2.779755518 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found
Mar 12 12:21:19.221664 master-0 kubenswrapper[7320]: I0312 12:21:19.221634 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:19.221846 master-0 kubenswrapper[7320]: E0312 12:21:19.221806 7320 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:19.221923 master-0 kubenswrapper[7320]: E0312 12:21:19.221884 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.221871455 +0000 UTC m=+2.780915416 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found
Mar 12 12:21:19.237135 master-0 kubenswrapper[7320]: I0312 12:21:19.237101 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:19.246673 master-0 kubenswrapper[7320]: I0312 12:21:19.246648 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5c7\" (UniqueName: \"kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:19.265454 master-0 kubenswrapper[7320]: I0312 12:21:19.265385 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhfq\" (UniqueName: \"kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:21:19.284750 master-0 kubenswrapper[7320]: I0312 12:21:19.284708 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lnbq\" (UniqueName: \"kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:21:19.306411 master-0 kubenswrapper[7320]: I0312 12:21:19.306342 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2c4\" (UniqueName: \"kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k"
Mar 12 12:21:19.330274 master-0 kubenswrapper[7320]: I0312 12:21:19.330217 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:21:19.330274 master-0 kubenswrapper[7320]: I0312 12:21:19.330288 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:19.330761 master-0 kubenswrapper[7320]: E0312 12:21:19.330511 7320 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 12 12:21:19.330761 master-0 kubenswrapper[7320]: E0312 12:21:19.330579 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.330559832 +0000 UTC m=+2.889603773 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found
Mar 12 12:21:19.331063 master-0 kubenswrapper[7320]: I0312 12:21:19.330978 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:19.331063 master-0 kubenswrapper[7320]: I0312 12:21:19.331024 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:19.331206 master-0 kubenswrapper[7320]: E0312 12:21:19.331094 7320 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:19.331206 master-0 kubenswrapper[7320]: E0312 12:21:19.331127 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.331115925 +0000 UTC m=+2.890159816 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:19.331206 master-0 kubenswrapper[7320]: E0312 12:21:19.331131 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 12 12:21:19.331206 master-0 kubenswrapper[7320]: E0312 12:21:19.331197 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.331178087 +0000 UTC m=+2.890222028 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331220 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: I0312 12:21:19.331222 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331251 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.331240998 +0000 UTC m=+2.890284989 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: I0312 12:21:19.331271 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331287 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: I0312 12:21:19.331308 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331320 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.33130698 +0000 UTC m=+2.890350961 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331387 7320 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331409 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.331403122 +0000 UTC m=+2.890447003 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331433 7320 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: E0312 12:21:19.331551 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.331525905 +0000 UTC m=+2.890569866 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:19.331613 master-0 kubenswrapper[7320]: I0312 12:21:19.331564 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd645\" (UniqueName: \"kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:19.349632 master-0 kubenswrapper[7320]: I0312 12:21:19.349520 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod 
\"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:19.367211 master-0 kubenswrapper[7320]: I0312 12:21:19.367121 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xlx\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:19.385273 master-0 kubenswrapper[7320]: I0312 12:21:19.385198 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2f6r\" (UniqueName: \"kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:21:19.405273 master-0 kubenswrapper[7320]: I0312 12:21:19.404995 7320 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 12 12:21:19.409973 master-0 kubenswrapper[7320]: I0312 12:21:19.409913 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:21:19.421857 master-0 kubenswrapper[7320]: E0312 12:21:19.421579 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:21:19.432345 master-0 kubenswrapper[7320]: I0312 12:21:19.431826 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:21:19.432345 master-0 kubenswrapper[7320]: E0312 12:21:19.432046 7320 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 12 12:21:19.432345 master-0 kubenswrapper[7320]: E0312 12:21:19.432093 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:20.432080557 +0000 UTC m=+2.991124438 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found Mar 12 12:21:19.440063 master-0 kubenswrapper[7320]: E0312 12:21:19.439742 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:19.460310 master-0 kubenswrapper[7320]: W0312 12:21:19.459836 7320 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 12 12:21:19.460310 master-0 kubenswrapper[7320]: E0312 12:21:19.459902 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 12 12:21:19.501816 master-0 kubenswrapper[7320]: I0312 12:21:19.500054 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:21:19.690524 master-0 kubenswrapper[7320]: E0312 12:21:19.690357 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460" Mar 12 12:21:19.691232 master-0 kubenswrapper[7320]: E0312 12:21:19.690603 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lnbq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-xqmw9_openshift-network-operator(54612733-158f-4a92-a1bf-f4a8d653ffaf): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 12:21:19.692697 master-0 kubenswrapper[7320]: E0312 12:21:19.692637 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-network-operator/iptables-alerter-xqmw9" podUID="54612733-158f-4a92-a1bf-f4a8d653ffaf" Mar 12 12:21:20.240568 master-0 kubenswrapper[7320]: E0312 12:21:20.240515 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 12 12:21:20.240736 master-0 
kubenswrapper[7320]: E0312 12:21:20.240606 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.240589838 +0000 UTC m=+4.799633719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found Mar 12 12:21:20.241129 master-0 kubenswrapper[7320]: I0312 12:21:20.240357 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:20.241228 master-0 kubenswrapper[7320]: I0312 12:21:20.241155 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:20.241292 master-0 kubenswrapper[7320]: I0312 12:21:20.241264 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:21:20.241512 master-0 kubenswrapper[7320]: E0312 12:21:20.241453 7320 secret.go:189] Couldn't get secret 
openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:20.241597 master-0 kubenswrapper[7320]: E0312 12:21:20.241527 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:20.241597 master-0 kubenswrapper[7320]: E0312 12:21:20.241553 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.241546051 +0000 UTC m=+4.800589932 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:20.241597 master-0 kubenswrapper[7320]: E0312 12:21:20.241589 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.241567581 +0000 UTC m=+4.800611532 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found Mar 12 12:21:20.241724 master-0 kubenswrapper[7320]: I0312 12:21:20.241461 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:20.241724 master-0 kubenswrapper[7320]: E0312 12:21:20.241609 7320 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 12 12:21:20.241724 master-0 kubenswrapper[7320]: E0312 12:21:20.241631 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.241625363 +0000 UTC m=+4.800669244 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : secret "metrics-daemon-secret" not found Mar 12 12:21:20.241724 master-0 kubenswrapper[7320]: I0312 12:21:20.241682 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:20.241881 master-0 kubenswrapper[7320]: E0312 12:21:20.241819 7320 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:20.241881 master-0 kubenswrapper[7320]: E0312 12:21:20.241861 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.241840288 +0000 UTC m=+4.800896119 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found Mar 12 12:21:20.251778 master-0 kubenswrapper[7320]: E0312 12:21:20.251726 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9" Mar 12 12:21:20.251945 master-0 kubenswrapper[7320]: E0312 12:21:20.251889 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-storage-version-migrator-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9,Command:[cluster-kube-storage-version-migrator-operator start],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m9gmt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-storage-version-migrator-operator-7f65c457f5-qpd6h_openshift-kube-storage-version-migrator-operator(0aeeef2a-f9df-4f87-b985-bd1da94c76c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 12:21:20.256701 master-0 kubenswrapper[7320]: E0312 12:21:20.256642 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" podUID="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" Mar 12 12:21:20.275139 master-0 kubenswrapper[7320]: I0312 12:21:20.275099 7320 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:21:20.342715 master-0 kubenswrapper[7320]: I0312 12:21:20.342666 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:20.342912 master-0 kubenswrapper[7320]: I0312 12:21:20.342723 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:20.342912 master-0 kubenswrapper[7320]: I0312 12:21:20.342752 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:20.342990 master-0 kubenswrapper[7320]: E0312 12:21:20.342944 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 12 12:21:20.343075 master-0 kubenswrapper[7320]: I0312 12:21:20.343007 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:20.343121 master-0 kubenswrapper[7320]: E0312 12:21:20.343087 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.343068046 +0000 UTC m=+4.902111927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found Mar 12 12:21:20.343156 master-0 kubenswrapper[7320]: E0312 12:21:20.343137 7320 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 12 12:21:20.343182 master-0 kubenswrapper[7320]: I0312 12:21:20.343150 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:20.343215 master-0 kubenswrapper[7320]: E0312 12:21:20.343186 7320 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 12 12:21:20.343215 master-0 kubenswrapper[7320]: E0312 12:21:20.343198 7320 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.343179859 +0000 UTC m=+4.902223830 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found Mar 12 12:21:20.343215 master-0 kubenswrapper[7320]: E0312 12:21:20.343212 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 12 12:21:20.343293 master-0 kubenswrapper[7320]: E0312 12:21:20.343238 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.34322327 +0000 UTC m=+4.902267151 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found Mar 12 12:21:20.343293 master-0 kubenswrapper[7320]: E0312 12:21:20.343268 7320 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 12 12:21:20.343293 master-0 kubenswrapper[7320]: E0312 12:21:20.343280 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:21:22.343273001 +0000 UTC m=+4.902316882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:20.343372 master-0 kubenswrapper[7320]: E0312 12:21:20.343298 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.343288831 +0000 UTC m=+4.902332832 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:20.343372 master-0 kubenswrapper[7320]: I0312 12:21:20.343266 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:20.343372 master-0 kubenswrapper[7320]: E0312 12:21:20.343319 7320 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:20.343372 master-0 kubenswrapper[7320]: I0312 12:21:20.343344 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:20.343372 master-0 kubenswrapper[7320]: E0312 12:21:20.343370 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:20.343541 master-0 kubenswrapper[7320]: E0312 12:21:20.343392 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.343383954 +0000 UTC m=+4.902427835 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:20.343541 master-0 kubenswrapper[7320]: E0312 12:21:20.343406 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.343399164 +0000 UTC m=+4.902443045 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:20.444072 master-0 kubenswrapper[7320]: I0312 12:21:20.444000 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:20.444279 master-0 kubenswrapper[7320]: E0312 12:21:20.444175 7320 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:20.444279 master-0 kubenswrapper[7320]: E0312 12:21:20.444247 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:22.444228232 +0000 UTC m=+5.003272193 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:20.768451 master-0 kubenswrapper[7320]: E0312 12:21:20.768379 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56"
Mar 12 12:21:20.768880 master-0 kubenswrapper[7320]: E0312 12:21:20.768683 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56,Command:[cluster-kube-controller-manager-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56,ValueFrom:nil,},EnvVar{Name:CLUSTER_POLICY_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35768a0c3eb24134dd38633e8acfc7db69ee96b2fd660e9bba3b8c996452fef7,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-controller-manager-operator-86d7cdfdfb-d4htx_openshift-kube-controller-manager-operator(6571f5e5-07ee-4e6c-a8ad-277bc52e35ee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 12 12:21:20.770663 master-0 kubenswrapper[7320]: E0312 12:21:20.770621 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" podUID="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee"
Mar 12 12:21:21.325752 master-0 kubenswrapper[7320]: E0312 12:21:21.325450 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab"
Mar 12 12:21:21.325752 master-0 kubenswrapper[7320]: E0312 12:21:21.325672 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-apiserver-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab,Command:[cluster-openshift-apiserver-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_APISERVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbztv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-apiserver-operator-799b6db4d7-gc2gv_openshift-apiserver-operator(ea80247e-b4dd-45dc-8255-6e68508c8480): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 12 12:21:21.326966 master-0 kubenswrapper[7320]: E0312 12:21:21.326909 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" podUID="ea80247e-b4dd-45dc-8255-6e68508c8480"
Mar 12 12:21:21.416461 master-0 kubenswrapper[7320]: I0312 12:21:21.416404 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:21:21.759621 master-0 kubenswrapper[7320]: I0312 12:21:21.759581 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:21:21.766047 master-0 kubenswrapper[7320]: I0312 12:21:21.766022 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:21:21.812913 master-0 kubenswrapper[7320]: E0312 12:21:21.812832 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b"
Mar 12 12:21:21.813320 master-0 kubenswrapper[7320]: E0312 12:21:21.813073 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b,Command:[cluster-openshift-controller-manager-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:ROUTE_CONTROLLER_MANAGER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fqcrz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-controller-manager-operator-8565d84698-cg7rd_openshift-controller-manager-operator(b5890f0c-cebe-4788-89f7-27568d875741): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 12 12:21:21.814444 master-0 kubenswrapper[7320]: E0312 12:21:21.814391 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" podUID="b5890f0c-cebe-4788-89f7-27568d875741"
Mar 12 12:21:21.881042 master-0 kubenswrapper[7320]: I0312 12:21:21.881011 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:21:22.264089 master-0 kubenswrapper[7320]: I0312 12:21:22.263975 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:22.264089 master-0 kubenswrapper[7320]: I0312 12:21:22.264049 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:22.264444 master-0 kubenswrapper[7320]: I0312 12:21:22.264208 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:22.264444 master-0 kubenswrapper[7320]: E0312 12:21:22.264215 7320 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 12 12:21:22.264444 master-0 kubenswrapper[7320]: E0312 12:21:22.264335 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.264317227 +0000 UTC m=+8.823361108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : secret "metrics-daemon-secret" not found
Mar 12 12:21:22.264444 master-0 kubenswrapper[7320]: I0312 12:21:22.264268 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:22.264444 master-0 kubenswrapper[7320]: I0312 12:21:22.264383 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:22.264611 master-0 kubenswrapper[7320]: E0312 12:21:22.264348 7320 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:22.264611 master-0 kubenswrapper[7320]: E0312 12:21:22.264556 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.264548963 +0000 UTC m=+8.823592844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found
Mar 12 12:21:22.264611 master-0 kubenswrapper[7320]: E0312 12:21:22.264379 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 12 12:21:22.264611 master-0 kubenswrapper[7320]: E0312 12:21:22.264580 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.264575193 +0000 UTC m=+8.823619064 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found
Mar 12 12:21:22.264611 master-0 kubenswrapper[7320]: E0312 12:21:22.264529 7320 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:22.264611 master-0 kubenswrapper[7320]: E0312 12:21:22.264598 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.264593914 +0000 UTC m=+8.823637795 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found
Mar 12 12:21:22.264611 master-0 kubenswrapper[7320]: E0312 12:21:22.264258 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:22.264786 master-0 kubenswrapper[7320]: E0312 12:21:22.264621 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.264616164 +0000 UTC m=+8.823660045 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:22.339884 master-0 kubenswrapper[7320]: E0312 12:21:22.339830 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3"
Mar 12 12:21:22.340091 master-0 kubenswrapper[7320]: E0312 12:21:22.340030 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt --terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pwfct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod etcd-operator-5884b9cd56-7nb6b_openshift-etcd-operator(ab087440-bdf2-4e2f-9a5a-434d50a2329a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 12 12:21:22.341342 master-0 kubenswrapper[7320]: E0312 12:21:22.341293 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" podUID="ab087440-bdf2-4e2f-9a5a-434d50a2329a"
Mar 12 12:21:22.365853 master-0 kubenswrapper[7320]: I0312 12:21:22.365798 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:22.366062 master-0 kubenswrapper[7320]: E0312 12:21:22.365966 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:22.366062 master-0 kubenswrapper[7320]: I0312 12:21:22.366036 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:22.366132 master-0 kubenswrapper[7320]: E0312 12:21:22.366046 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.366026487 +0000 UTC m=+8.925070408 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:22.366132 master-0 kubenswrapper[7320]: I0312 12:21:22.366081 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:22.366132 master-0 kubenswrapper[7320]: I0312 12:21:22.366107 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:22.366229 master-0 kubenswrapper[7320]: I0312 12:21:22.366155 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:21:22.366229 master-0 kubenswrapper[7320]: I0312 12:21:22.366187 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:22.366229 master-0 kubenswrapper[7320]: I0312 12:21:22.366225 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:22.366311 master-0 kubenswrapper[7320]: E0312 12:21:22.366295 7320 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:22.366341 master-0 kubenswrapper[7320]: E0312 12:21:22.366322 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.366312544 +0000 UTC m=+8.925356425 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:22.366453 master-0 kubenswrapper[7320]: E0312 12:21:22.366358 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 12 12:21:22.366453 master-0 kubenswrapper[7320]: E0312 12:21:22.366376 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.366370785 +0000 UTC m=+8.925414666 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found
Mar 12 12:21:22.366453 master-0 kubenswrapper[7320]: E0312 12:21:22.366408 7320 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:22.366453 master-0 kubenswrapper[7320]: E0312 12:21:22.366424 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.366419386 +0000 UTC m=+8.925463267 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:22.366645 master-0 kubenswrapper[7320]: E0312 12:21:22.366456 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 12 12:21:22.366645 master-0 kubenswrapper[7320]: E0312 12:21:22.366457 7320 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 12 12:21:22.366645 master-0 kubenswrapper[7320]: E0312 12:21:22.366472 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.366467807 +0000 UTC m=+8.925511688 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:22.366645 master-0 kubenswrapper[7320]: E0312 12:21:22.366510 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.366499598 +0000 UTC m=+8.925543479 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found
Mar 12 12:21:22.366645 master-0 kubenswrapper[7320]: E0312 12:21:22.366519 7320 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 12 12:21:22.366645 master-0 kubenswrapper[7320]: E0312 12:21:22.366535 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.366530419 +0000 UTC m=+8.925574300 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found
Mar 12 12:21:22.467648 master-0 kubenswrapper[7320]: I0312 12:21:22.467589 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:22.467873 master-0 kubenswrapper[7320]: E0312 12:21:22.467755 7320 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:22.467873 master-0 kubenswrapper[7320]: E0312 12:21:22.467823 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:26.467804708 +0000 UTC m=+9.026848589 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:22.666755 master-0 kubenswrapper[7320]: E0312 12:21:22.666577 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43"
Mar 12 12:21:22.666944 master-0 kubenswrapper[7320]: E0312 12:21:22.666831 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:openshift-api,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43,Command:[write-available-featuresets --asset-output-dir=/available-featuregates --payload-version=$(OPERATOR_IMAGE_VERSION)],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:available-featuregates,ReadOnly:false,MountPath:/available-featuregates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x8mvz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-config-operator-64488f9d78-fg5mg_openshift-config-operator(a154f648-b96d-449e-b0f5-ba32266000c2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 12:21:22.668258 master-0 kubenswrapper[7320]: E0312 12:21:22.668203 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-api\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" podUID="a154f648-b96d-449e-b0f5-ba32266000c2" Mar 12 12:21:23.124680 master-0 kubenswrapper[7320]: E0312 12:21:23.124598 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3" 
Mar 12 12:21:23.125164 master-0 kubenswrapper[7320]: E0312 12:21:23.124790 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:csi-snapshot-controller-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3,Command:[],Args:[start -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERAND_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1,ValueFrom:nil,},EnvVar{Name:WEBHOOK_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5e9989ee0577e930adcd97085176343a881bf92537dda1bf0325a3b1faf96d6,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9bx48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000150000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-snapshot-controller-operator-5685fbc7d-vmj4h_openshift-cluster-storage-operator(5a012d0b-d1a8-4cd3-8b91-b346d0445f24): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 12:21:23.125942 master-0 kubenswrapper[7320]: E0312 12:21:23.125898 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" podUID="5a012d0b-d1a8-4cd3-8b91-b346d0445f24" Mar 12 12:21:23.378964 master-0 kubenswrapper[7320]: I0312 12:21:23.378803 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:23.383635 master-0 kubenswrapper[7320]: I0312 12:21:23.383552 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 
12 12:21:23.625744 master-0 kubenswrapper[7320]: I0312 12:21:23.625686 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:23.628494 master-0 kubenswrapper[7320]: E0312 12:21:23.628407 7320 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba" Mar 12 12:21:23.628824 master-0 kubenswrapper[7320]: E0312 12:21:23.628715 7320 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-svpvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-69b6fc6b88-s2gsp_openshift-service-ca-operator(55bf535c-93ab-4870-a9d2-c02496d71ef0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 12 12:21:23.630241 master-0 kubenswrapper[7320]: E0312 12:21:23.630177 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" podUID="55bf535c-93ab-4870-a9d2-c02496d71ef0" Mar 12 12:21:23.669075 master-0 kubenswrapper[7320]: I0312 12:21:23.669030 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:21:23.825257 master-0 kubenswrapper[7320]: I0312 12:21:23.824816 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-dfz7x"] Mar 12 12:21:23.883775 master-0 kubenswrapper[7320]: I0312 12:21:23.883421 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dfz7x" event={"ID":"269d77d9-815e-4324-8827-1ce429063ed1","Type":"ContainerStarted","Data":"43388f3fc4030bd546e0a3b05c45ff8414138bbab582c7b9e9531efe462ae9bb"} Mar 12 12:21:23.884966 master-0 kubenswrapper[7320]: I0312 12:21:23.884914 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" event={"ID":"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326","Type":"ContainerStarted","Data":"84b74517ff1cd7d9b9c2c7332d3d21e7e4c1bd81a8eed9a021787271d5de1935"} Mar 12 12:21:23.886793 master-0 kubenswrapper[7320]: I0312 12:21:23.886184 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" event={"ID":"a346ac54-02fe-417f-a49d-038e45b13a1d","Type":"ContainerStarted","Data":"ebdafa22bf6b7ed28319fcbd34230d3e124233b075083adefec56e18e5a788b3"} Mar 12 12:21:23.886793 master-0 kubenswrapper[7320]: I0312 12:21:23.886263 7320 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:21:23.886793 master-0 kubenswrapper[7320]: I0312 12:21:23.886280 7320 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:21:24.086508 master-0 kubenswrapper[7320]: I0312 12:21:24.086394 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:24.091345 master-0 kubenswrapper[7320]: I0312 12:21:24.091297 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:21:24.891753 master-0 kubenswrapper[7320]: I0312 12:21:24.891674 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-dfz7x" event={"ID":"269d77d9-815e-4324-8827-1ce429063ed1","Type":"ContainerStarted","Data":"4ea85637e51e6ec71de9b67d37261c4224785c4c676d6ec4196688c7159484ff"} Mar 12 12:21:24.895574 master-0 kubenswrapper[7320]: I0312 12:21:24.895527 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" event={"ID":"9b960fe2-d59e-4ee1-bd9d-455b46753cb9","Type":"ContainerStarted","Data":"2ecd7f48de11aae6e5506fd79aa229be8956b481497e7ee996afbf26849c14c9"} Mar 12 12:21:24.899948 master-0 kubenswrapper[7320]: I0312 12:21:24.897460 7320 generic.go:334] "Generic (PLEG): container finished" podID="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" containerID="84b74517ff1cd7d9b9c2c7332d3d21e7e4c1bd81a8eed9a021787271d5de1935" exitCode=0 Mar 12 12:21:24.899948 master-0 kubenswrapper[7320]: I0312 12:21:24.897814 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" event={"ID":"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326","Type":"ContainerDied","Data":"84b74517ff1cd7d9b9c2c7332d3d21e7e4c1bd81a8eed9a021787271d5de1935"} Mar 12 12:21:25.579500 master-0 kubenswrapper[7320]: I0312 12:21:25.578980 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:21:26.315607 master-0 kubenswrapper[7320]: I0312 12:21:26.315524 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " 
pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:21:26.315607 master-0 kubenswrapper[7320]: I0312 12:21:26.315591 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: I0312 12:21:26.315782 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.315780 7320 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.315857 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.315840522 +0000 UTC m=+16.874884403 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : secret "metrics-daemon-secret" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: I0312 12:21:26.316092 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: I0312 12:21:26.316150 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316141 7320 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316324 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.316284043 +0000 UTC m=+16.875327954 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316323 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316362 7320 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316444 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.316395776 +0000 UTC m=+16.875439857 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316527 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.316471337 +0000 UTC m=+16.875515258 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316447 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:26.317329 master-0 kubenswrapper[7320]: E0312 12:21:26.316595 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.31656988 +0000 UTC m=+16.875613801 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: I0312 12:21:26.423071 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: I0312 12:21:26.423271 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod 
\"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: I0312 12:21:26.423367 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: I0312 12:21:26.423440 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: I0312 12:21:26.423619 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: I0312 12:21:26.423810 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:21:26.429745 master-0 
kubenswrapper[7320]: I0312 12:21:26.423923 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.424193 7320 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.424316 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.424281323 +0000 UTC m=+16.983325244 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.425313 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.425417 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:21:34.42539084 +0000 UTC m=+16.984434761 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.425610 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.425694 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.425671497 +0000 UTC m=+16.984715418 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.425833 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.425911 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.425891522 +0000 UTC m=+16.984935433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.426012 7320 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.426145 7320 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.426156 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.426118987 +0000 UTC m=+16.985162908 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found
Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.426273 7320 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.426293 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.426265561 +0000 UTC m=+16.985309452 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:26.429745 master-0 kubenswrapper[7320]: E0312 12:21:26.426338 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.426321702 +0000 UTC m=+16.985365623 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found
Mar 12 12:21:26.524864 master-0 kubenswrapper[7320]: I0312 12:21:26.524798 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:26.525277 master-0 kubenswrapper[7320]: E0312 12:21:26.525168 7320 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:26.525432 master-0 kubenswrapper[7320]: E0312 12:21:26.525391 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:34.525334587 +0000 UTC m=+17.084378628 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:26.907627 master-0 kubenswrapper[7320]: I0312 12:21:26.907449 7320 generic.go:334] "Generic (PLEG): container finished" podID="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" containerID="5c4da145a949d5e5c70a918c24d5fac3234eb8326b6c012c2dbb76b6f559a57f" exitCode=0
Mar 12 12:21:26.907627 master-0 kubenswrapper[7320]: I0312 12:21:26.907527 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" event={"ID":"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326","Type":"ContainerDied","Data":"5c4da145a949d5e5c70a918c24d5fac3234eb8326b6c012c2dbb76b6f559a57f"}
Mar 12 12:21:26.962713 master-0 kubenswrapper[7320]: I0312 12:21:26.962630 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:26.963216 master-0 kubenswrapper[7320]: I0312 12:21:26.963172 7320 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 12:21:26.963310 master-0 kubenswrapper[7320]: I0312 12:21:26.963254 7320 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 12:21:26.983437 master-0 kubenswrapper[7320]: I0312 12:21:26.983165 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:27.911332 master-0 kubenswrapper[7320]: I0312 12:21:27.911238 7320 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 12:21:28.184040 master-0 kubenswrapper[7320]: I0312 12:21:28.183936 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:21:28.187712 master-0 kubenswrapper[7320]: I0312 12:21:28.187679 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:21:28.925868 master-0 kubenswrapper[7320]: I0312 12:21:28.925791 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:21:29.047466 master-0 kubenswrapper[7320]: I0312 12:21:29.047396 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:29.047763 master-0 kubenswrapper[7320]: I0312 12:21:29.047663 7320 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 12:21:29.078101 master-0 kubenswrapper[7320]: I0312 12:21:29.078046 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:21:29.924646 master-0 kubenswrapper[7320]: I0312 12:21:29.924570 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" event={"ID":"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326","Type":"ContainerStarted","Data":"8ee43439e03e174fce129de95caef5dbf8392a0dcca1c8da1e1088570ad3efed"}
Mar 12 12:21:33.939788 master-0 kubenswrapper[7320]: I0312 12:21:33.939419 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" event={"ID":"b5890f0c-cebe-4788-89f7-27568d875741","Type":"ContainerStarted","Data":"54c0c581483d3deef8c82f62f09c7eb9259f3f30693873246ec5154f0dcb5178"}
Mar 12 12:21:34.332912 master-0 kubenswrapper[7320]: I0312 12:21:34.332819 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:34.333210 master-0 kubenswrapper[7320]: I0312 12:21:34.332924 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:34.333210 master-0 kubenswrapper[7320]: E0312 12:21:34.333124 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:34.333378 master-0 kubenswrapper[7320]: E0312 12:21:34.333165 7320 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:34.333532 master-0 kubenswrapper[7320]: E0312 12:21:34.333317 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.333275902 +0000 UTC m=+32.892319813 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "performance-addon-operator-webhook-cert" not found
Mar 12 12:21:34.333665 master-0 kubenswrapper[7320]: I0312 12:21:34.333514 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:34.333665 master-0 kubenswrapper[7320]: E0312 12:21:34.333585 7320 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 12 12:21:34.333665 master-0 kubenswrapper[7320]: E0312 12:21:34.333627 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls podName:f3f295ac-7bc7-43b7-bd30-db82e7f16cd7 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.333584631 +0000 UTC m=+32.892628522 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls") pod "dns-operator-589895fbb7-l8x6p" (UID: "f3f295ac-7bc7-43b7-bd30-db82e7f16cd7") : secret "metrics-tls" not found
Mar 12 12:21:34.333665 master-0 kubenswrapper[7320]: I0312 12:21:34.333592 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:34.334032 master-0 kubenswrapper[7320]: E0312 12:21:34.333685 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls podName:a22189f2-3f35-4ea6-9892-39a1b46637e2 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.333656423 +0000 UTC m=+32.892700344 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls") pod "ingress-operator-677db989d6-vpss8" (UID: "a22189f2-3f35-4ea6-9892-39a1b46637e2") : secret "metrics-tls" not found
Mar 12 12:21:34.334032 master-0 kubenswrapper[7320]: E0312 12:21:34.333720 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 12 12:21:34.334032 master-0 kubenswrapper[7320]: E0312 12:21:34.333788 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.333766796 +0000 UTC m=+32.892810717 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found
Mar 12 12:21:34.334032 master-0 kubenswrapper[7320]: I0312 12:21:34.333828 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:34.334032 master-0 kubenswrapper[7320]: E0312 12:21:34.333979 7320 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 12 12:21:34.334363 master-0 kubenswrapper[7320]: E0312 12:21:34.334042 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.334026644 +0000 UTC m=+32.893070555 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : secret "metrics-daemon-secret" not found
Mar 12 12:21:34.435443 master-0 kubenswrapper[7320]: I0312 12:21:34.435328 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:34.435760 master-0 kubenswrapper[7320]: E0312 12:21:34.435716 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 12 12:21:34.435886 master-0 kubenswrapper[7320]: I0312 12:21:34.435797 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:34.435960 master-0 kubenswrapper[7320]: E0312 12:21:34.435897 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.435826696 +0000 UTC m=+32.994870617 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:34.435960 master-0 kubenswrapper[7320]: I0312 12:21:34.435939 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:34.436089 master-0 kubenswrapper[7320]: E0312 12:21:34.436020 7320 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 12 12:21:34.436150 master-0 kubenswrapper[7320]: E0312 12:21:34.436121 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.436089234 +0000 UTC m=+32.995133145 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : secret "image-registry-operator-tls" not found
Mar 12 12:21:34.436301 master-0 kubenswrapper[7320]: E0312 12:21:34.436227 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:34.436373 master-0 kubenswrapper[7320]: I0312 12:21:34.436252 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:34.436438 master-0 kubenswrapper[7320]: E0312 12:21:34.436363 7320 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 12 12:21:34.436438 master-0 kubenswrapper[7320]: E0312 12:21:34.436363 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.436333761 +0000 UTC m=+32.995377672 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:34.436616 master-0 kubenswrapper[7320]: I0312 12:21:34.436518 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:34.436616 master-0 kubenswrapper[7320]: E0312 12:21:34.436566 7320 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:34.436616 master-0 kubenswrapper[7320]: I0312 12:21:34.436599 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:34.436805 master-0 kubenswrapper[7320]: E0312 12:21:34.436632 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.436603499 +0000 UTC m=+32.995647460 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : secret "node-tuning-operator-tls" not found
Mar 12 12:21:34.436805 master-0 kubenswrapper[7320]: E0312 12:21:34.436685 7320 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 12 12:21:34.436805 master-0 kubenswrapper[7320]: E0312 12:21:34.436701 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.436681841 +0000 UTC m=+32.995725872 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:34.436805 master-0 kubenswrapper[7320]: E0312 12:21:34.436744 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.436722882 +0000 UTC m=+32.995766903 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found
Mar 12 12:21:34.436805 master-0 kubenswrapper[7320]: I0312 12:21:34.436784 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:21:34.437112 master-0 kubenswrapper[7320]: E0312 12:21:34.436931 7320 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 12 12:21:34.437112 master-0 kubenswrapper[7320]: E0312 12:21:34.437007 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.43698631 +0000 UTC m=+32.996030241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found
Mar 12 12:21:34.537896 master-0 kubenswrapper[7320]: I0312 12:21:34.537810 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:34.538203 master-0 kubenswrapper[7320]: E0312 12:21:34.538045 7320 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:34.538306 master-0 kubenswrapper[7320]: E0312 12:21:34.538228 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert podName:8e6f7496-1047-482d-9203-ff83a9eb7d93 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:50.538200776 +0000 UTC m=+33.097244687 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert") pod "cluster-version-operator-745944c6b7-b2t49" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93") : secret "cluster-version-operator-serving-cert" not found
Mar 12 12:21:35.811417 master-0 kubenswrapper[7320]: I0312 12:21:35.810997 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"]
Mar 12 12:21:35.812726 master-0 kubenswrapper[7320]: E0312 12:21:35.811471 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller"
Mar 12 12:21:35.812726 master-0 kubenswrapper[7320]: I0312 12:21:35.811529 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller"
Mar 12 12:21:35.812726 master-0 kubenswrapper[7320]: E0312 12:21:35.811546 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober"
Mar 12 12:21:35.812726 master-0 kubenswrapper[7320]: I0312 12:21:35.811557 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober"
Mar 12 12:21:35.812726 master-0 kubenswrapper[7320]: I0312 12:21:35.811676 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober"
Mar 12 12:21:35.812726 master-0 kubenswrapper[7320]: I0312 12:21:35.811690 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller"
Mar 12 12:21:35.812726 master-0 kubenswrapper[7320]: I0312 12:21:35.812073 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:35.816268 master-0 kubenswrapper[7320]: I0312 12:21:35.816211 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 12:21:35.816268 master-0 kubenswrapper[7320]: I0312 12:21:35.816267 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 12:21:35.816268 master-0 kubenswrapper[7320]: I0312 12:21:35.816272 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 12:21:35.816654 master-0 kubenswrapper[7320]: I0312 12:21:35.816219 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 12 12:21:35.816654 master-0 kubenswrapper[7320]: I0312 12:21:35.816541 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:21:35.816654 master-0 kubenswrapper[7320]: I0312 12:21:35.816588 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 12 12:21:35.860393 master-0 kubenswrapper[7320]: I0312 12:21:35.860335 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"]
Mar 12 12:21:35.947262 master-0 kubenswrapper[7320]: I0312 12:21:35.947188 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xqmw9" event={"ID":"54612733-158f-4a92-a1bf-f4a8d653ffaf","Type":"ContainerStarted","Data":"ac3f7d690b2971ad018b71c2c20c4ab75403757ec0322a7835a7b176375157a2"}
Mar 12 12:21:35.948393 master-0 kubenswrapper[7320]: I0312 12:21:35.948354 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" event={"ID":"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee","Type":"ContainerStarted","Data":"78cb426b98a54442332ae7dea069dbb75e6d07a8377b812d61f3d58bf1a33d17"}
Mar 12 12:21:35.949818 master-0 kubenswrapper[7320]: I0312 12:21:35.949787 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" event={"ID":"55bf535c-93ab-4870-a9d2-c02496d71ef0","Type":"ContainerStarted","Data":"6a3b8be971ca63800f0532603d6fc3d806dc294fa7565d4894a90520eb420540"}
Mar 12 12:21:35.950855 master-0 kubenswrapper[7320]: I0312 12:21:35.950816 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" event={"ID":"5a012d0b-d1a8-4cd3-8b91-b346d0445f24","Type":"ContainerStarted","Data":"34dc0209ff1d7a04ba94a94ac3d8ef4ae5cf2a5dd4532cb0a823e435a74047ac"}
Mar 12 12:21:35.954416 master-0 kubenswrapper[7320]: I0312 12:21:35.954360 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:35.954416 master-0 kubenswrapper[7320]: I0312 12:21:35.954417 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwrjw\" (UniqueName: \"kubernetes.io/projected/d41a4853-d8fd-40af-8e36-7e3ca185316b-kube-api-access-jwrjw\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:35.954632 master-0 kubenswrapper[7320]: I0312 12:21:35.954471 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:35.954632 master-0 kubenswrapper[7320]: I0312 12:21:35.954583 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:35.954632 master-0 kubenswrapper[7320]: I0312 12:21:35.954607 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: I0312 12:21:36.055492 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: I0312 12:21:36.055564 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwrjw\" (UniqueName: \"kubernetes.io/projected/d41a4853-d8fd-40af-8e36-7e3ca185316b-kube-api-access-jwrjw\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: I0312 12:21:36.055619 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: I0312 12:21:36.055750 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: I0312 12:21:36.055777 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: E0312 12:21:36.056649 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: E0312 12:21:36.056713 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:36.556696255 +0000 UTC m=+19.115740136 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : configmap "config" not found
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: E0312 12:21:36.056706 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 12 12:21:36.057373 master-0 kubenswrapper[7320]: E0312 12:21:36.056824 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:36.556796348 +0000 UTC m=+19.115840229 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : configmap "openshift-global-ca" not found
Mar 12 12:21:36.057884 master-0 kubenswrapper[7320]: E0312 12:21:36.057467 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:36.057884 master-0 kubenswrapper[7320]: E0312 12:21:36.057553 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:36.55754157 +0000 UTC m=+19.116585651 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : configmap "client-ca" not found
Mar 12 12:21:36.058047 master-0 kubenswrapper[7320]: E0312 12:21:36.058033 7320 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 12 12:21:36.058104 master-0 kubenswrapper[7320]: E0312 12:21:36.058067 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:36.558058155 +0000 UTC m=+19.117102036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : secret "serving-cert" not found
Mar 12 12:21:36.107067 master-0 kubenswrapper[7320]: I0312 12:21:36.106994 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwrjw\" (UniqueName: \"kubernetes.io/projected/d41a4853-d8fd-40af-8e36-7e3ca185316b-kube-api-access-jwrjw\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.563695 master-0 kubenswrapper[7320]: I0312 12:21:36.563267 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.563695 master-0 kubenswrapper[7320]: I0312 12:21:36.563605 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.563695 master-0 kubenswrapper[7320]: I0312 12:21:36.563686 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.564043 master-0 kubenswrapper[7320]: I0312 12:21:36.563714 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:36.564275 master-0 kubenswrapper[7320]: E0312 12:21:36.563509 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 12 12:21:36.564552 master-0 kubenswrapper[7320]: E0312 12:21:36.564536 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:37.564462575 +0000 UTC m=+20.123506456 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : configmap "config" not found Mar 12 12:21:36.565393 master-0 kubenswrapper[7320]: E0312 12:21:36.565376 7320 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 12 12:21:36.565538 master-0 kubenswrapper[7320]: E0312 12:21:36.565524 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:37.565505475 +0000 UTC m=+20.124549356 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : secret "serving-cert" not found Mar 12 12:21:36.565702 master-0 kubenswrapper[7320]: E0312 12:21:36.565687 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:21:36.565806 master-0 kubenswrapper[7320]: E0312 12:21:36.565796 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:37.565778453 +0000 UTC m=+20.124822334 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : configmap "client-ca" not found Mar 12 12:21:36.565923 master-0 kubenswrapper[7320]: E0312 12:21:36.565912 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 12 12:21:36.566012 master-0 kubenswrapper[7320]: E0312 12:21:36.566002 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:37.56599162 +0000 UTC m=+20.125035501 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : configmap "openshift-global-ca" not found Mar 12 12:21:36.581111 master-0 kubenswrapper[7320]: I0312 12:21:36.579966 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw"] Mar 12 12:21:36.581111 master-0 kubenswrapper[7320]: I0312 12:21:36.580447 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" Mar 12 12:21:36.595275 master-0 kubenswrapper[7320]: I0312 12:21:36.595214 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw"] Mar 12 12:21:36.767413 master-0 kubenswrapper[7320]: I0312 12:21:36.767350 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvrpf\" (UniqueName: \"kubernetes.io/projected/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef-kube-api-access-wvrpf\") pod \"csi-snapshot-controller-7577d6f48-kf7kw\" (UID: \"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" Mar 12 12:21:36.868860 master-0 kubenswrapper[7320]: I0312 12:21:36.868650 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrpf\" (UniqueName: \"kubernetes.io/projected/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef-kube-api-access-wvrpf\") pod \"csi-snapshot-controller-7577d6f48-kf7kw\" (UID: \"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" Mar 12 12:21:36.889333 master-0 kubenswrapper[7320]: I0312 12:21:36.889259 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrpf\" (UniqueName: \"kubernetes.io/projected/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef-kube-api-access-wvrpf\") pod \"csi-snapshot-controller-7577d6f48-kf7kw\" (UID: \"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" Mar 12 12:21:36.892972 master-0 kubenswrapper[7320]: I0312 12:21:36.892918 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" Mar 12 12:21:36.966519 master-0 kubenswrapper[7320]: I0312 12:21:36.962548 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" event={"ID":"0aeeef2a-f9df-4f87-b985-bd1da94c76c3","Type":"ContainerStarted","Data":"d357ccc688b993b9454b28bfd7fb28a5d58ecf020cbf9839477bf958a0d7b96f"} Mar 12 12:21:36.966519 master-0 kubenswrapper[7320]: I0312 12:21:36.963217 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"] Mar 12 12:21:36.966519 master-0 kubenswrapper[7320]: E0312 12:21:36.963552 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" podUID="d41a4853-d8fd-40af-8e36-7e3ca185316b" Mar 12 12:21:36.966519 master-0 kubenswrapper[7320]: I0312 12:21:36.966085 7320 generic.go:334] "Generic (PLEG): container finished" podID="a154f648-b96d-449e-b0f5-ba32266000c2" containerID="4200e813be631eda72870daef6f5fca8205e93c466ebc5b8aa512a46303fa669" exitCode=0 Mar 12 12:21:36.966519 master-0 kubenswrapper[7320]: I0312 12:21:36.966159 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" event={"ID":"a154f648-b96d-449e-b0f5-ba32266000c2","Type":"ContainerDied","Data":"4200e813be631eda72870daef6f5fca8205e93c466ebc5b8aa512a46303fa669"} Mar 12 12:21:36.968638 master-0 kubenswrapper[7320]: I0312 12:21:36.968575 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" 
event={"ID":"ea80247e-b4dd-45dc-8255-6e68508c8480","Type":"ContainerStarted","Data":"fc4b4e674883cb3e17ae8f6229f477c4fc095a1d76196bd6eee19ed1d8bb25c9"} Mar 12 12:21:36.970511 master-0 kubenswrapper[7320]: I0312 12:21:36.970454 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"] Mar 12 12:21:36.971648 master-0 kubenswrapper[7320]: I0312 12:21:36.971025 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:36.978505 master-0 kubenswrapper[7320]: I0312 12:21:36.974579 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 12:21:36.978505 master-0 kubenswrapper[7320]: I0312 12:21:36.974608 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 12:21:36.978505 master-0 kubenswrapper[7320]: I0312 12:21:36.974579 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 12:21:36.978505 master-0 kubenswrapper[7320]: I0312 12:21:36.974827 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 12:21:36.978505 master-0 kubenswrapper[7320]: I0312 12:21:36.977597 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 12:21:36.982052 master-0 kubenswrapper[7320]: I0312 12:21:36.981761 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"] Mar 12 12:21:37.073510 master-0 kubenswrapper[7320]: I0312 12:21:37.072678 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.073510 master-0 kubenswrapper[7320]: I0312 12:21:37.073101 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.073510 master-0 kubenswrapper[7320]: I0312 12:21:37.073173 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wll8q\" (UniqueName: \"kubernetes.io/projected/98d6d8ce-3a64-4c9d-956a-6325a877d871-kube-api-access-wll8q\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.073510 master-0 kubenswrapper[7320]: I0312 12:21:37.073223 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-config\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.136907 master-0 kubenswrapper[7320]: I0312 12:21:37.136768 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw"] Mar 12 12:21:37.176035 master-0 kubenswrapper[7320]: I0312 12:21:37.175965 7320 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-config\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.176201 master-0 kubenswrapper[7320]: I0312 12:21:37.176166 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.176463 master-0 kubenswrapper[7320]: I0312 12:21:37.176432 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.176713 master-0 kubenswrapper[7320]: I0312 12:21:37.176644 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wll8q\" (UniqueName: \"kubernetes.io/projected/98d6d8ce-3a64-4c9d-956a-6325a877d871-kube-api-access-wll8q\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.178547 master-0 kubenswrapper[7320]: I0312 12:21:37.178457 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-config\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: 
\"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.178920 master-0 kubenswrapper[7320]: E0312 12:21:37.178876 7320 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 12 12:21:37.178984 master-0 kubenswrapper[7320]: E0312 12:21:37.178957 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:37.678936822 +0000 UTC m=+20.237980693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : secret "serving-cert" not found Mar 12 12:21:37.181035 master-0 kubenswrapper[7320]: E0312 12:21:37.178894 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:21:37.181035 master-0 kubenswrapper[7320]: E0312 12:21:37.179431 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:37.679380135 +0000 UTC m=+20.238424016 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : configmap "client-ca" not found Mar 12 12:21:37.200590 master-0 kubenswrapper[7320]: I0312 12:21:37.195387 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wll8q\" (UniqueName: \"kubernetes.io/projected/98d6d8ce-3a64-4c9d-956a-6325a877d871-kube-api-access-wll8q\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.588512 master-0 kubenswrapper[7320]: I0312 12:21:37.588283 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.588512 master-0 kubenswrapper[7320]: I0312 12:21:37.588428 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.588787 master-0 kubenswrapper[7320]: I0312 12:21:37.588678 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " 
pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.588787 master-0 kubenswrapper[7320]: I0312 12:21:37.588713 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.590538 master-0 kubenswrapper[7320]: E0312 12:21:37.588857 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:21:37.590538 master-0 kubenswrapper[7320]: E0312 12:21:37.588957 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:39.588934987 +0000 UTC m=+22.147978868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : configmap "client-ca" not found Mar 12 12:21:37.590538 master-0 kubenswrapper[7320]: E0312 12:21:37.589584 7320 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 12 12:21:37.590538 master-0 kubenswrapper[7320]: E0312 12:21:37.589753 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert podName:d41a4853-d8fd-40af-8e36-7e3ca185316b nodeName:}" failed. No retries permitted until 2026-03-12 12:21:39.589608536 +0000 UTC m=+22.148652417 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert") pod "controller-manager-6f7fd6c796-d8cn4" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b") : secret "serving-cert" not found Mar 12 12:21:37.593540 master-0 kubenswrapper[7320]: I0312 12:21:37.592623 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.595017 master-0 kubenswrapper[7320]: I0312 12:21:37.594889 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-d8cn4\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.689975 master-0 kubenswrapper[7320]: I0312 12:21:37.689818 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:37.689975 master-0 kubenswrapper[7320]: I0312 12:21:37.689947 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 
12 12:21:37.690175 master-0 kubenswrapper[7320]: E0312 12:21:37.690088 7320 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 12 12:21:37.690175 master-0 kubenswrapper[7320]: E0312 12:21:37.690162 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:38.690140352 +0000 UTC m=+21.249184233 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : secret "serving-cert" not found Mar 12 12:21:37.690282 master-0 kubenswrapper[7320]: E0312 12:21:37.690239 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:21:37.690364 master-0 kubenswrapper[7320]: E0312 12:21:37.690343 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:38.690317628 +0000 UTC m=+21.249361509 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : configmap "client-ca" not found Mar 12 12:21:37.973007 master-0 kubenswrapper[7320]: I0312 12:21:37.972944 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" event={"ID":"ab087440-bdf2-4e2f-9a5a-434d50a2329a","Type":"ContainerStarted","Data":"c10ad00e6d5ca94dd8aee1068bcae9a35fd5744bbc9fa9703850b00e0063db31"} Mar 12 12:21:37.975143 master-0 kubenswrapper[7320]: I0312 12:21:37.974866 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.975416 master-0 kubenswrapper[7320]: I0312 12:21:37.975387 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerStarted","Data":"64f1dc2d81d717bf0671e44d8f5d716029eb32e5b7d761c1919a015da1e533b6"} Mar 12 12:21:37.981608 master-0 kubenswrapper[7320]: I0312 12:21:37.981558 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4" Mar 12 12:21:37.994706 master-0 kubenswrapper[7320]: I0312 12:21:37.993118 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwrjw\" (UniqueName: \"kubernetes.io/projected/d41a4853-d8fd-40af-8e36-7e3ca185316b-kube-api-access-jwrjw\") pod \"d41a4853-d8fd-40af-8e36-7e3ca185316b\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " Mar 12 12:21:37.994706 master-0 kubenswrapper[7320]: I0312 12:21:37.993169 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles\") pod \"d41a4853-d8fd-40af-8e36-7e3ca185316b\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " Mar 12 12:21:37.994706 master-0 kubenswrapper[7320]: I0312 12:21:37.993216 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config\") pod \"d41a4853-d8fd-40af-8e36-7e3ca185316b\" (UID: \"d41a4853-d8fd-40af-8e36-7e3ca185316b\") " Mar 12 12:21:37.994995 master-0 kubenswrapper[7320]: I0312 12:21:37.994826 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "d41a4853-d8fd-40af-8e36-7e3ca185316b" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:21:37.995459 master-0 kubenswrapper[7320]: I0312 12:21:37.995300 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config" (OuterVolumeSpecName: "config") pod "d41a4853-d8fd-40af-8e36-7e3ca185316b" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:38.003177 master-0 kubenswrapper[7320]: I0312 12:21:38.002947 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d41a4853-d8fd-40af-8e36-7e3ca185316b-kube-api-access-jwrjw" (OuterVolumeSpecName: "kube-api-access-jwrjw") pod "d41a4853-d8fd-40af-8e36-7e3ca185316b" (UID: "d41a4853-d8fd-40af-8e36-7e3ca185316b"). InnerVolumeSpecName "kube-api-access-jwrjw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:21:38.094302 master-0 kubenswrapper[7320]: I0312 12:21:38.094241 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwrjw\" (UniqueName: \"kubernetes.io/projected/d41a4853-d8fd-40af-8e36-7e3ca185316b-kube-api-access-jwrjw\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:38.094302 master-0 kubenswrapper[7320]: I0312 12:21:38.094275 7320 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:38.094302 master-0 kubenswrapper[7320]: I0312 12:21:38.094284 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:38.398255 master-0 kubenswrapper[7320]: I0312 12:21:38.398122 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"]
Mar 12 12:21:38.398833 master-0 kubenswrapper[7320]: I0312 12:21:38.398813 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"
Mar 12 12:21:38.404073 master-0 kubenswrapper[7320]: I0312 12:21:38.404032 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 12 12:21:38.404227 master-0 kubenswrapper[7320]: I0312 12:21:38.404087 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 12 12:21:38.420014 master-0 kubenswrapper[7320]: I0312 12:21:38.419966 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"]
Mar 12 12:21:38.499044 master-0 kubenswrapper[7320]: I0312 12:21:38.498901 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf74f\" (UniqueName: \"kubernetes.io/projected/580bafd6-af8c-4961-b959-b736a180e309-kube-api-access-lf74f\") pod \"migrator-57ccdf9b5-4rjrp\" (UID: \"580bafd6-af8c-4961-b959-b736a180e309\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"
Mar 12 12:21:38.600539 master-0 kubenswrapper[7320]: I0312 12:21:38.600201 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf74f\" (UniqueName: \"kubernetes.io/projected/580bafd6-af8c-4961-b959-b736a180e309-kube-api-access-lf74f\") pod \"migrator-57ccdf9b5-4rjrp\" (UID: \"580bafd6-af8c-4961-b959-b736a180e309\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"
Mar 12 12:21:38.623048 master-0 kubenswrapper[7320]: I0312 12:21:38.622969 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf74f\" (UniqueName: \"kubernetes.io/projected/580bafd6-af8c-4961-b959-b736a180e309-kube-api-access-lf74f\") pod \"migrator-57ccdf9b5-4rjrp\" (UID: \"580bafd6-af8c-4961-b959-b736a180e309\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"
Mar 12 12:21:38.700649 master-0 kubenswrapper[7320]: I0312 12:21:38.700600 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:38.700900 master-0 kubenswrapper[7320]: I0312 12:21:38.700666 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:38.700952 master-0 kubenswrapper[7320]: E0312 12:21:38.700869 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:38.701092 master-0 kubenswrapper[7320]: E0312 12:21:38.701049 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:40.700979814 +0000 UTC m=+23.260023915 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : configmap "client-ca" not found
Mar 12 12:21:38.701092 master-0 kubenswrapper[7320]: E0312 12:21:38.701063 7320 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 12 12:21:38.701186 master-0 kubenswrapper[7320]: E0312 12:21:38.701112 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:40.701103527 +0000 UTC m=+23.260147648 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : secret "serving-cert" not found
Mar 12 12:21:38.737684 master-0 kubenswrapper[7320]: I0312 12:21:38.737624 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"
Mar 12 12:21:38.800137 master-0 kubenswrapper[7320]: I0312 12:21:38.800021 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-gxx99"]
Mar 12 12:21:38.801119 master-0 kubenswrapper[7320]: I0312 12:21:38.800634 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:38.803196 master-0 kubenswrapper[7320]: I0312 12:21:38.803146 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 12 12:21:38.803807 master-0 kubenswrapper[7320]: I0312 12:21:38.803403 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 12 12:21:38.803807 master-0 kubenswrapper[7320]: I0312 12:21:38.803168 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 12 12:21:38.803807 master-0 kubenswrapper[7320]: I0312 12:21:38.803667 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 12 12:21:38.810049 master-0 kubenswrapper[7320]: I0312 12:21:38.809873 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-gxx99"]
Mar 12 12:21:38.903096 master-0 kubenswrapper[7320]: I0312 12:21:38.903013 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7tct\" (UniqueName: \"kubernetes.io/projected/15bf86d9-62b3-4af8-b6f6-23131d712332-kube-api-access-g7tct\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:38.903300 master-0 kubenswrapper[7320]: I0312 12:21:38.903219 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-key\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:38.903437 master-0 kubenswrapper[7320]: I0312 12:21:38.903394 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:38.983161 master-0 kubenswrapper[7320]: I0312 12:21:38.983033 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"
Mar 12 12:21:39.005586 master-0 kubenswrapper[7320]: I0312 12:21:39.005535 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-key\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:39.005682 master-0 kubenswrapper[7320]: I0312 12:21:39.005633 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:39.005976 master-0 kubenswrapper[7320]: I0312 12:21:39.005937 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7tct\" (UniqueName: \"kubernetes.io/projected/15bf86d9-62b3-4af8-b6f6-23131d712332-kube-api-access-g7tct\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:39.006897 master-0 kubenswrapper[7320]: I0312 12:21:39.006868 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:39.010131 master-0 kubenswrapper[7320]: I0312 12:21:39.010092 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-key\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:39.033904 master-0 kubenswrapper[7320]: I0312 12:21:39.033866 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"]
Mar 12 12:21:39.037285 master-0 kubenswrapper[7320]: I0312 12:21:39.037240 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7tct\" (UniqueName: \"kubernetes.io/projected/15bf86d9-62b3-4af8-b6f6-23131d712332-kube-api-access-g7tct\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:39.045150 master-0 kubenswrapper[7320]: I0312 12:21:39.045080 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-d8cn4"]
Mar 12 12:21:39.108202 master-0 kubenswrapper[7320]: I0312 12:21:39.108146 7320 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d41a4853-d8fd-40af-8e36-7e3ca185316b-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:39.108400 master-0 kubenswrapper[7320]: I0312 12:21:39.108285 7320 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d41a4853-d8fd-40af-8e36-7e3ca185316b-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:39.130872 master-0 kubenswrapper[7320]: I0312 12:21:39.130793 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:21:39.773799 master-0 kubenswrapper[7320]: I0312 12:21:39.773219 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d41a4853-d8fd-40af-8e36-7e3ca185316b" path="/var/lib/kubelet/pods/d41a4853-d8fd-40af-8e36-7e3ca185316b/volumes"
Mar 12 12:21:39.846116 master-0 kubenswrapper[7320]: I0312 12:21:39.846047 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-65bb95677d-rqwcb"]
Mar 12 12:21:39.847029 master-0 kubenswrapper[7320]: I0312 12:21:39.846888 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:39.849035 master-0 kubenswrapper[7320]: I0312 12:21:39.848988 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 12 12:21:39.849594 master-0 kubenswrapper[7320]: I0312 12:21:39.849560 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 12:21:39.850007 master-0 kubenswrapper[7320]: I0312 12:21:39.849968 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:21:39.851003 master-0 kubenswrapper[7320]: I0312 12:21:39.850974 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-65bb95677d-rqwcb"]
Mar 12 12:21:39.851172 master-0 kubenswrapper[7320]: I0312 12:21:39.851153 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 12:21:39.851172 master-0 kubenswrapper[7320]: I0312 12:21:39.851162 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 12 12:21:39.856101 master-0 kubenswrapper[7320]: I0312 12:21:39.856067 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 12:21:39.918470 master-0 kubenswrapper[7320]: I0312 12:21:39.918430 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:39.918470 master-0 kubenswrapper[7320]: I0312 12:21:39.918468 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-proxy-ca-bundles\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:39.918674 master-0 kubenswrapper[7320]: I0312 12:21:39.918517 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-config\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:39.918674 master-0 kubenswrapper[7320]: I0312 12:21:39.918544 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:39.918674 master-0 kubenswrapper[7320]: I0312 12:21:39.918590 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnb5q\" (UniqueName: \"kubernetes.io/projected/33619450-d04b-4cb7-999b-2b3631fc3e91-kube-api-access-dnb5q\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: I0312 12:21:40.020227 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-config\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: I0312 12:21:40.020291 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: I0312 12:21:40.020372 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnb5q\" (UniqueName: \"kubernetes.io/projected/33619450-d04b-4cb7-999b-2b3631fc3e91-kube-api-access-dnb5q\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: I0312 12:21:40.020510 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: I0312 12:21:40.020535 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-proxy-ca-bundles\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: I0312 12:21:40.022031 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-proxy-ca-bundles\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: I0312 12:21:40.022885 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-config\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: E0312 12:21:40.022980 7320 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: E0312 12:21:40.023033 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:40.523015305 +0000 UTC m=+23.082059196 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : secret "serving-cert" not found
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: E0312 12:21:40.023900 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:40.024000 master-0 kubenswrapper[7320]: E0312 12:21:40.024002 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:40.523976644 +0000 UTC m=+23.083020525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : configmap "client-ca" not found
Mar 12 12:21:40.064768 master-0 kubenswrapper[7320]: I0312 12:21:40.064705 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnb5q\" (UniqueName: \"kubernetes.io/projected/33619450-d04b-4cb7-999b-2b3631fc3e91-kube-api-access-dnb5q\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.135981 master-0 kubenswrapper[7320]: I0312 12:21:40.133219 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"]
Mar 12 12:21:40.138922 master-0 kubenswrapper[7320]: W0312 12:21:40.138884 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod580bafd6_af8c_4961_b959_b736a180e309.slice/crio-9093184a94434d195f504d748e233081459211af66933b52ecc618c767700a31 WatchSource:0}: Error finding container 9093184a94434d195f504d748e233081459211af66933b52ecc618c767700a31: Status 404 returned error can't find the container with id 9093184a94434d195f504d748e233081459211af66933b52ecc618c767700a31
Mar 12 12:21:40.181174 master-0 kubenswrapper[7320]: I0312 12:21:40.181114 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-gxx99"]
Mar 12 12:21:40.493788 master-0 kubenswrapper[7320]: I0312 12:21:40.493729 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"]
Mar 12 12:21:40.494454 master-0 kubenswrapper[7320]: I0312 12:21:40.494410 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.522952 master-0 kubenswrapper[7320]: I0312 12:21:40.522906 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 12 12:21:40.523405 master-0 kubenswrapper[7320]: I0312 12:21:40.523355 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 12 12:21:40.523627 master-0 kubenswrapper[7320]: I0312 12:21:40.523360 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 12 12:21:40.525910 master-0 kubenswrapper[7320]: I0312 12:21:40.525865 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"]
Mar 12 12:21:40.528782 master-0 kubenswrapper[7320]: I0312 12:21:40.528712 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.528896 master-0 kubenswrapper[7320]: I0312 12:21:40.528816 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.528896 master-0 kubenswrapper[7320]: I0312 12:21:40.528871 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.528966 master-0 kubenswrapper[7320]: I0312 12:21:40.528911 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.528966 master-0 kubenswrapper[7320]: I0312 12:21:40.528933 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:40.529022 master-0 kubenswrapper[7320]: I0312 12:21:40.528965 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wj8x\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-kube-api-access-6wj8x\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.529022 master-0 kubenswrapper[7320]: I0312 12:21:40.528985 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.529975 master-0 kubenswrapper[7320]: E0312 12:21:40.529160 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:40.529975 master-0 kubenswrapper[7320]: E0312 12:21:40.529211 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:41.529195829 +0000 UTC m=+24.088239710 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : configmap "client-ca" not found
Mar 12 12:21:40.529975 master-0 kubenswrapper[7320]: E0312 12:21:40.529655 7320 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 12 12:21:40.529975 master-0 kubenswrapper[7320]: E0312 12:21:40.529721 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:41.529697834 +0000 UTC m=+24.088741715 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : secret "serving-cert" not found
Mar 12 12:21:40.609024 master-0 kubenswrapper[7320]: I0312 12:21:40.608911 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"]
Mar 12 12:21:40.609669 master-0 kubenswrapper[7320]: I0312 12:21:40.609634 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.611499 master-0 kubenswrapper[7320]: I0312 12:21:40.611420 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 12 12:21:40.612500 master-0 kubenswrapper[7320]: I0312 12:21:40.612447 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 12 12:21:40.612964 master-0 kubenswrapper[7320]: I0312 12:21:40.612928 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 12 12:21:40.617332 master-0 kubenswrapper[7320]: I0312 12:21:40.617283 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 12 12:21:40.621940 master-0 kubenswrapper[7320]: I0312 12:21:40.621891 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"]
Mar 12 12:21:40.630066 master-0 kubenswrapper[7320]: I0312 12:21:40.629951 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/022dd526-0ea5-4224-9d2e-778ed4ef8a56-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.630273 master-0 kubenswrapper[7320]: I0312 12:21:40.630104 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.630273 master-0 kubenswrapper[7320]: I0312 12:21:40.630146 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.630273 master-0 kubenswrapper[7320]: I0312 12:21:40.630170 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.630415 master-0 kubenswrapper[7320]: I0312 12:21:40.630364 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.630461 master-0 kubenswrapper[7320]: I0312 12:21:40.630416 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxrnf\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-kube-api-access-pxrnf\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.630461 master-0 kubenswrapper[7320]: I0312 12:21:40.630449 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.630595 master-0 kubenswrapper[7320]: I0312 12:21:40.630499 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.630595 master-0 kubenswrapper[7320]: I0312 12:21:40.630541 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.630595 master-0 kubenswrapper[7320]: I0312 12:21:40.630575 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wj8x\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-kube-api-access-6wj8x\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.630712 master-0 kubenswrapper[7320]: I0312 12:21:40.630602 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.630755 master-0 kubenswrapper[7320]: I0312 12:21:40.630699 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.630800 master-0 kubenswrapper[7320]: I0312 12:21:40.630724 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.630916 master-0 kubenswrapper[7320]: I0312 12:21:40.630872 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.639503 master-0 kubenswrapper[7320]: I0312 12:21:40.637358 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.667874 master-0 kubenswrapper[7320]: I0312 12:21:40.666827 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wj8x\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-kube-api-access-6wj8x\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:40.731803 master-0 kubenswrapper[7320]: I0312 12:21:40.731726 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:40.731803 master-0 kubenswrapper[7320]: I0312 12:21:40.731801 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/022dd526-0ea5-4224-9d2e-778ed4ef8a56-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:40.732059 master-0 kubenswrapper[7320]: E0312 12:21:40.731878 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:40.732059 master-0 kubenswrapper[7320]: E0312 12:21:40.731951 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed.
No retries permitted until 2026-03-12 12:21:44.73193359 +0000 UTC m=+27.290977471 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : configmap "client-ca" not found Mar 12 12:21:40.732059 master-0 kubenswrapper[7320]: I0312 12:21:40.732004 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" Mar 12 12:21:40.732059 master-0 kubenswrapper[7320]: I0312 12:21:40.732043 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.732245 master-0 kubenswrapper[7320]: E0312 12:21:40.732199 7320 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 12 12:21:40.732287 master-0 kubenswrapper[7320]: I0312 12:21:40.732244 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.732345 master-0 kubenswrapper[7320]: E0312 12:21:40.732300 7320 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:44.73227281 +0000 UTC m=+27.291316691 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : secret "serving-cert" not found Mar 12 12:21:40.732532 master-0 kubenswrapper[7320]: I0312 12:21:40.732459 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrnf\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-kube-api-access-pxrnf\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.732532 master-0 kubenswrapper[7320]: I0312 12:21:40.732509 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.732646 master-0 kubenswrapper[7320]: E0312 12:21:40.732549 7320 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 12 12:21:40.732646 master-0 kubenswrapper[7320]: E0312 12:21:40.732607 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs podName:022dd526-0ea5-4224-9d2e-778ed4ef8a56 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:21:41.232589979 +0000 UTC m=+23.791633950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-pqph7" (UID: "022dd526-0ea5-4224-9d2e-778ed4ef8a56") : secret "catalogserver-cert" not found Mar 12 12:21:40.732646 master-0 kubenswrapper[7320]: I0312 12:21:40.732630 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.732755 master-0 kubenswrapper[7320]: I0312 12:21:40.732681 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.732883 master-0 kubenswrapper[7320]: I0312 12:21:40.732847 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.733168 master-0 kubenswrapper[7320]: I0312 12:21:40.733130 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/022dd526-0ea5-4224-9d2e-778ed4ef8a56-cache\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.735402 master-0 kubenswrapper[7320]: I0312 12:21:40.735360 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.748912 master-0 kubenswrapper[7320]: I0312 12:21:40.748861 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrnf\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-kube-api-access-pxrnf\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:40.827359 master-0 kubenswrapper[7320]: I0312 12:21:40.827293 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:21:41.000665 master-0 kubenswrapper[7320]: I0312 12:21:41.000223 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"] Mar 12 12:21:41.000872 master-0 kubenswrapper[7320]: I0312 12:21:41.000676 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerStarted","Data":"7085fb6154550cf5e6b3b06a657089d39dcfd14f6f8134eed107ad674b6e5b62"} Mar 12 12:21:41.003584 master-0 kubenswrapper[7320]: I0312 12:21:41.003383 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" event={"ID":"a154f648-b96d-449e-b0f5-ba32266000c2","Type":"ContainerStarted","Data":"962d17c59c83032444283250e4b6771dc0674c33ed005bbb62fede3e00c87666"} Mar 12 12:21:41.003584 master-0 kubenswrapper[7320]: I0312 12:21:41.003573 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:41.007509 master-0 kubenswrapper[7320]: I0312 12:21:41.004613 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp" event={"ID":"580bafd6-af8c-4961-b959-b736a180e309","Type":"ContainerStarted","Data":"9093184a94434d195f504d748e233081459211af66933b52ecc618c767700a31"} Mar 12 12:21:41.007509 master-0 kubenswrapper[7320]: I0312 12:21:41.005948 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" event={"ID":"15bf86d9-62b3-4af8-b6f6-23131d712332","Type":"ContainerStarted","Data":"dccd5d7d429a30ef17453995a9a86b8815c101400dca3ce09b7103fbd7a3f58c"} Mar 12 12:21:41.007509 
master-0 kubenswrapper[7320]: I0312 12:21:41.005971 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" event={"ID":"15bf86d9-62b3-4af8-b6f6-23131d712332","Type":"ContainerStarted","Data":"9ed5d4e2f5f9d30b72c838a61c437059eadf2c2990c7efa7688884bd954ef475"} Mar 12 12:21:41.015835 master-0 kubenswrapper[7320]: I0312 12:21:41.014661 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" podStartSLOduration=2.26912642 podStartE2EDuration="5.014639185s" podCreationTimestamp="2026-03-12 12:21:36 +0000 UTC" firstStartedPulling="2026-03-12 12:21:37.14815981 +0000 UTC m=+19.707203691" lastFinishedPulling="2026-03-12 12:21:39.893672565 +0000 UTC m=+22.452716456" observedRunningTime="2026-03-12 12:21:41.013933434 +0000 UTC m=+23.572977325" watchObservedRunningTime="2026-03-12 12:21:41.014639185 +0000 UTC m=+23.573683066" Mar 12 12:21:41.027608 master-0 kubenswrapper[7320]: I0312 12:21:41.027515 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" podStartSLOduration=3.02745619 podStartE2EDuration="3.02745619s" podCreationTimestamp="2026-03-12 12:21:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:21:41.02607629 +0000 UTC m=+23.585120171" watchObservedRunningTime="2026-03-12 12:21:41.02745619 +0000 UTC m=+23.586500081" Mar 12 12:21:41.244546 master-0 kubenswrapper[7320]: I0312 12:21:41.241711 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " 
pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:41.244546 master-0 kubenswrapper[7320]: E0312 12:21:41.241869 7320 secret.go:189] Couldn't get secret openshift-catalogd/catalogserver-cert: secret "catalogserver-cert" not found Mar 12 12:21:41.244546 master-0 kubenswrapper[7320]: E0312 12:21:41.241917 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs podName:022dd526-0ea5-4224-9d2e-778ed4ef8a56 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:42.241903224 +0000 UTC m=+24.800947105 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "catalogserver-certs" (UniqueName: "kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs") pod "catalogd-controller-manager-7f8b8b6f4c-pqph7" (UID: "022dd526-0ea5-4224-9d2e-778ed4ef8a56") : secret "catalogserver-cert" not found Mar 12 12:21:41.554501 master-0 kubenswrapper[7320]: I0312 12:21:41.551137 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb" Mar 12 12:21:41.554501 master-0 kubenswrapper[7320]: I0312 12:21:41.551448 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb" Mar 12 12:21:41.554501 master-0 kubenswrapper[7320]: E0312 12:21:41.551592 7320 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 12 
12:21:41.554501 master-0 kubenswrapper[7320]: E0312 12:21:41.551638 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:43.551624411 +0000 UTC m=+26.110668292 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : secret "serving-cert" not found Mar 12 12:21:41.554501 master-0 kubenswrapper[7320]: E0312 12:21:41.551945 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:21:41.554501 master-0 kubenswrapper[7320]: E0312 12:21:41.551970 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:43.551963171 +0000 UTC m=+26.111007052 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : configmap "client-ca" not found Mar 12 12:21:42.014048 master-0 kubenswrapper[7320]: I0312 12:21:42.013244 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" event={"ID":"aa8ddfdd-7f2d-4fd4-b666-1497dee752df","Type":"ContainerStarted","Data":"c37981d772b9a85d17d279e86121944c6ab87ebc93f5d8856b6cc53e349bc15d"} Mar 12 12:21:42.014048 master-0 kubenswrapper[7320]: I0312 12:21:42.013284 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" event={"ID":"aa8ddfdd-7f2d-4fd4-b666-1497dee752df","Type":"ContainerStarted","Data":"0618c9f5aa46fc08303406a2cb903067ccdfab22a4fdf61ae7d07562528f34f0"} Mar 12 12:21:42.014048 master-0 kubenswrapper[7320]: I0312 12:21:42.013300 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" event={"ID":"aa8ddfdd-7f2d-4fd4-b666-1497dee752df","Type":"ContainerStarted","Data":"b8ffdf31e610994af2b57b9f88a7795bae0cba26fb04c18d8c2445e3f0680a53"} Mar 12 12:21:42.014048 master-0 kubenswrapper[7320]: I0312 12:21:42.014016 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:21:42.029988 master-0 kubenswrapper[7320]: I0312 12:21:42.028894 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" podStartSLOduration=2.028873665 podStartE2EDuration="2.028873665s" podCreationTimestamp="2026-03-12 12:21:40 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:21:42.026713022 +0000 UTC m=+24.585756913" watchObservedRunningTime="2026-03-12 12:21:42.028873665 +0000 UTC m=+24.587917546" Mar 12 12:21:42.269808 master-0 kubenswrapper[7320]: I0312 12:21:42.269584 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:42.277384 master-0 kubenswrapper[7320]: I0312 12:21:42.277325 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:42.426195 master-0 kubenswrapper[7320]: I0312 12:21:42.425657 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:21:42.650882 master-0 kubenswrapper[7320]: I0312 12:21:42.650303 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"] Mar 12 12:21:42.657099 master-0 kubenswrapper[7320]: W0312 12:21:42.656868 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod022dd526_0ea5_4224_9d2e_778ed4ef8a56.slice/crio-ded4724d743688b7920664606d22270128172d27c82b29e81d8b50ac01a66fa9 WatchSource:0}: Error finding container ded4724d743688b7920664606d22270128172d27c82b29e81d8b50ac01a66fa9: Status 404 returned error can't find the container with id ded4724d743688b7920664606d22270128172d27c82b29e81d8b50ac01a66fa9 Mar 12 12:21:43.018576 master-0 kubenswrapper[7320]: I0312 12:21:43.018428 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" event={"ID":"022dd526-0ea5-4224-9d2e-778ed4ef8a56","Type":"ContainerStarted","Data":"386158a93dec64e377fe80c39218a7c01585165e6dc5fd11bdceff85cedacb85"} Mar 12 12:21:43.018576 master-0 kubenswrapper[7320]: I0312 12:21:43.018513 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" event={"ID":"022dd526-0ea5-4224-9d2e-778ed4ef8a56","Type":"ContainerStarted","Data":"ded4724d743688b7920664606d22270128172d27c82b29e81d8b50ac01a66fa9"} Mar 12 12:21:43.025967 master-0 kubenswrapper[7320]: I0312 12:21:43.024647 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp" event={"ID":"580bafd6-af8c-4961-b959-b736a180e309","Type":"ContainerStarted","Data":"e6df0e5279c517a4fdd0f406398e86c0a88b434fccb4c83440b9adcae14ae2f5"} Mar 12 12:21:43.025967 master-0 kubenswrapper[7320]: I0312 12:21:43.024704 7320 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp" event={"ID":"580bafd6-af8c-4961-b959-b736a180e309","Type":"ContainerStarted","Data":"7e9d5db84ee42a66980da63e1c66fd089d4f1a47b3a8e42e5916da69343fede6"} Mar 12 12:21:43.583986 master-0 kubenswrapper[7320]: I0312 12:21:43.583898 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb" Mar 12 12:21:43.585469 master-0 kubenswrapper[7320]: E0312 12:21:43.584307 7320 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 12 12:21:43.585469 master-0 kubenswrapper[7320]: I0312 12:21:43.584421 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb" Mar 12 12:21:43.585469 master-0 kubenswrapper[7320]: E0312 12:21:43.584510 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:47.58444415 +0000 UTC m=+30.143488181 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : secret "serving-cert" not found Mar 12 12:21:43.585469 master-0 kubenswrapper[7320]: E0312 12:21:43.584612 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:21:43.585469 master-0 kubenswrapper[7320]: E0312 12:21:43.584698 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:47.584671567 +0000 UTC m=+30.143715488 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : configmap "client-ca" not found Mar 12 12:21:43.597953 master-0 kubenswrapper[7320]: I0312 12:21:43.597911 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:21:43.618672 master-0 kubenswrapper[7320]: I0312 12:21:43.618175 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp" podStartSLOduration=3.343611766 podStartE2EDuration="5.618149808s" podCreationTimestamp="2026-03-12 12:21:38 +0000 UTC" firstStartedPulling="2026-03-12 12:21:40.143893558 +0000 UTC m=+22.702937439" lastFinishedPulling="2026-03-12 12:21:42.41843161 +0000 UTC m=+24.977475481" observedRunningTime="2026-03-12 12:21:43.043373844 +0000 UTC m=+25.602417725" watchObservedRunningTime="2026-03-12 12:21:43.618149808 +0000 UTC m=+26.177193689" Mar 12 
12:21:43.759248 master-0 kubenswrapper[7320]: I0312 12:21:43.758926 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 12 12:21:43.759829 master-0 kubenswrapper[7320]: I0312 12:21:43.759806 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 12 12:21:43.763728 master-0 kubenswrapper[7320]: I0312 12:21:43.763675 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 12 12:21:43.771303 master-0 kubenswrapper[7320]: I0312 12:21:43.770530 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 12 12:21:43.786604 master-0 kubenswrapper[7320]: I0312 12:21:43.786567 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-var-lock\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 12 12:21:43.786767 master-0 kubenswrapper[7320]: I0312 12:21:43.786635 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 12 12:21:43.786767 master-0 kubenswrapper[7320]: I0312 12:21:43.786675 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0" Mar 12 
12:21:43.887204 master-0 kubenswrapper[7320]: I0312 12:21:43.887080 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-var-lock\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 12 12:21:43.887204 master-0 kubenswrapper[7320]: I0312 12:21:43.887165 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 12 12:21:43.887204 master-0 kubenswrapper[7320]: I0312 12:21:43.887206 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 12 12:21:43.887442 master-0 kubenswrapper[7320]: I0312 12:21:43.887283 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 12 12:21:43.887442 master-0 kubenswrapper[7320]: I0312 12:21:43.887330 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-var-lock\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 12 12:21:43.906888 master-0 kubenswrapper[7320]: I0312 12:21:43.906774 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 12 12:21:44.029719 master-0 kubenswrapper[7320]: I0312 12:21:44.029609 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" event={"ID":"022dd526-0ea5-4224-9d2e-778ed4ef8a56","Type":"ContainerStarted","Data":"d5a1ce8cbbe911064b01e5c73e3e8c17228cb220219ccba9621023e45528295b"}
Mar 12 12:21:44.047474 master-0 kubenswrapper[7320]: I0312 12:21:44.047369 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" podStartSLOduration=4.047339195 podStartE2EDuration="4.047339195s" podCreationTimestamp="2026-03-12 12:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:21:44.047025806 +0000 UTC m=+26.606069707" watchObservedRunningTime="2026-03-12 12:21:44.047339195 +0000 UTC m=+26.606383116"
Mar 12 12:21:44.081023 master-0 kubenswrapper[7320]: I0312 12:21:44.080950 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 12 12:21:44.274997 master-0 kubenswrapper[7320]: I0312 12:21:44.274950 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 12 12:21:44.801250 master-0 kubenswrapper[7320]: I0312 12:21:44.800858 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:44.802053 master-0 kubenswrapper[7320]: I0312 12:21:44.801351 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:44.802053 master-0 kubenswrapper[7320]: E0312 12:21:44.801029 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:44.802053 master-0 kubenswrapper[7320]: E0312 12:21:44.801550 7320 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 12 12:21:44.802053 master-0 kubenswrapper[7320]: E0312 12:21:44.801555 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:52.801523066 +0000 UTC m=+35.360566987 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : configmap "client-ca" not found
Mar 12 12:21:44.802053 master-0 kubenswrapper[7320]: E0312 12:21:44.801914 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:52.801890597 +0000 UTC m=+35.360934538 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : secret "serving-cert" not found
Mar 12 12:21:45.035735 master-0 kubenswrapper[7320]: I0312 12:21:45.035538 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e7c1a86e-0ad7-4978-80ae-163dbc44fafb","Type":"ContainerStarted","Data":"b315bdbdd00e7e78faaebae53e6e5aca4dcfbe013781ad0113a093ac0097dc1b"}
Mar 12 12:21:45.035735 master-0 kubenswrapper[7320]: I0312 12:21:45.035581 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e7c1a86e-0ad7-4978-80ae-163dbc44fafb","Type":"ContainerStarted","Data":"cf3d062904eac9ebbd852d158da7968f568f0b7b439f11aee315609af8d30a5c"}
Mar 12 12:21:45.035735 master-0 kubenswrapper[7320]: I0312 12:21:45.035599 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:45.047280 master-0 kubenswrapper[7320]: I0312 12:21:45.047182 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=2.047159244 podStartE2EDuration="2.047159244s" podCreationTimestamp="2026-03-12 12:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:21:45.046028141 +0000 UTC m=+27.605072032" watchObservedRunningTime="2026-03-12 12:21:45.047159244 +0000 UTC m=+27.606203125"
Mar 12 12:21:46.040363 master-0 kubenswrapper[7320]: I0312 12:21:46.040305 7320 generic.go:334] "Generic (PLEG): container finished" podID="a154f648-b96d-449e-b0f5-ba32266000c2" containerID="962d17c59c83032444283250e4b6771dc0674c33ed005bbb62fede3e00c87666" exitCode=0
Mar 12 12:21:46.040906 master-0 kubenswrapper[7320]: I0312 12:21:46.040424 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" event={"ID":"a154f648-b96d-449e-b0f5-ba32266000c2","Type":"ContainerDied","Data":"962d17c59c83032444283250e4b6771dc0674c33ed005bbb62fede3e00c87666"}
Mar 12 12:21:46.041338 master-0 kubenswrapper[7320]: I0312 12:21:46.041315 7320 scope.go:117] "RemoveContainer" containerID="962d17c59c83032444283250e4b6771dc0674c33ed005bbb62fede3e00c87666"
Mar 12 12:21:46.587600 master-0 kubenswrapper[7320]: I0312 12:21:46.587411 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:21:46.597801 master-0 kubenswrapper[7320]: I0312 12:21:46.597741 7320 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:21:47.047593 master-0 kubenswrapper[7320]: I0312 12:21:47.047536 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" event={"ID":"a154f648-b96d-449e-b0f5-ba32266000c2","Type":"ContainerStarted","Data":"ba12b4f3c2b615d1e6edfe9917ea5510fedb4efbe7773f21a5c975a8c101f165"}
Mar 12 12:21:47.048111 master-0 kubenswrapper[7320]: I0312 12:21:47.047900 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:21:47.640819 master-0 kubenswrapper[7320]: I0312 12:21:47.640736 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:47.641052 master-0 kubenswrapper[7320]: E0312 12:21:47.640923 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:47.641052 master-0 kubenswrapper[7320]: I0312 12:21:47.641011 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:47.641141 master-0 kubenswrapper[7320]: E0312 12:21:47.641030 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:55.641006374 +0000 UTC m=+38.200050255 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : configmap "client-ca" not found
Mar 12 12:21:47.648915 master-0 kubenswrapper[7320]: I0312 12:21:47.648858 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:49.594644 master-0 kubenswrapper[7320]: I0312 12:21:49.594006 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:21:50.373498 master-0 kubenswrapper[7320]: I0312 12:21:50.372159 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:21:50.373498 master-0 kubenswrapper[7320]: I0312 12:21:50.372250 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:50.373498 master-0 kubenswrapper[7320]: I0312 12:21:50.372422 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:21:50.373498 master-0 kubenswrapper[7320]: E0312 12:21:50.372676 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 12 12:21:50.373498 master-0 kubenswrapper[7320]: E0312 12:21:50.372761 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:22.372735916 +0000 UTC m=+64.931779837 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : secret "catalog-operator-serving-cert" not found
Mar 12 12:21:50.373498 master-0 kubenswrapper[7320]: I0312 12:21:50.373281 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:50.373498 master-0 kubenswrapper[7320]: I0312 12:21:50.373334 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:50.373894 master-0 kubenswrapper[7320]: E0312 12:21:50.373621 7320 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 12 12:21:50.373894 master-0 kubenswrapper[7320]: E0312 12:21:50.373752 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs podName:e64bc838-280e-4231-9732-1adb69fed0bc nodeName:}" failed. No retries permitted until 2026-03-12 12:22:22.373713544 +0000 UTC m=+64.932757475 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs") pod "network-metrics-daemon-4m9jh" (UID: "e64bc838-280e-4231-9732-1adb69fed0bc") : secret "metrics-daemon-secret" not found
Mar 12 12:21:50.384007 master-0 kubenswrapper[7320]: I0312 12:21:50.382607 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:50.400506 master-0 kubenswrapper[7320]: I0312 12:21:50.394375 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:50.410222 master-0 kubenswrapper[7320]: I0312 12:21:50.406355 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:50.476036 master-0 kubenswrapper[7320]: I0312 12:21:50.475961 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:21:50.476268 master-0 kubenswrapper[7320]: E0312 12:21:50.476140 7320 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 12 12:21:50.476268 master-0 kubenswrapper[7320]: I0312 12:21:50.476207 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:21:50.476268 master-0 kubenswrapper[7320]: E0312 12:21:50.476223 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs podName:74d06933-afab-43a3-a1d3-88a569178d34 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:22.476200818 +0000 UTC m=+65.035244709 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs") pod "multus-admission-controller-8d675b596-xpzn2" (UID: "74d06933-afab-43a3-a1d3-88a569178d34") : secret "multus-admission-controller-secret" not found
Mar 12 12:21:50.476427 master-0 kubenswrapper[7320]: E0312 12:21:50.476277 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 12 12:21:50.476427 master-0 kubenswrapper[7320]: E0312 12:21:50.476304 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:22.47629545 +0000 UTC m=+65.035339341 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : secret "olm-operator-serving-cert" not found
Mar 12 12:21:50.476427 master-0 kubenswrapper[7320]: I0312 12:21:50.476278 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:50.476427 master-0 kubenswrapper[7320]: I0312 12:21:50.476371 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:21:50.476627 master-0 kubenswrapper[7320]: E0312 12:21:50.476545 7320 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 12 12:21:50.476627 master-0 kubenswrapper[7320]: E0312 12:21:50.476617 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:22.476600909 +0000 UTC m=+65.035644790 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : secret "package-server-manager-serving-cert" not found
Mar 12 12:21:50.477036 master-0 kubenswrapper[7320]: I0312 12:21:50.476991 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:50.477112 master-0 kubenswrapper[7320]: I0312 12:21:50.477067 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:21:50.477203 master-0 kubenswrapper[7320]: I0312 12:21:50.477170 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:21:50.477256 master-0 kubenswrapper[7320]: E0312 12:21:50.477203 7320 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:50.477256 master-0 kubenswrapper[7320]: E0312 12:21:50.477251 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls podName:ae2269d7-f11f-46d1-95e7-f89a70ee1152 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:22.477239178 +0000 UTC m=+65.036283059 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-tztzr" (UID: "ae2269d7-f11f-46d1-95e7-f89a70ee1152") : secret "cluster-monitoring-operator-tls" not found
Mar 12 12:21:50.477363 master-0 kubenswrapper[7320]: E0312 12:21:50.477297 7320 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 12 12:21:50.477363 master-0 kubenswrapper[7320]: E0312 12:21:50.477352 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:22:22.477337981 +0000 UTC m=+65.036381922 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : secret "marketplace-operator-metrics" not found
Mar 12 12:21:50.479292 master-0 kubenswrapper[7320]: I0312 12:21:50.479250 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:50.480583 master-0 kubenswrapper[7320]: I0312 12:21:50.480547 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:50.578441 master-0 kubenswrapper[7320]: I0312 12:21:50.578374 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:50.582807 master-0 kubenswrapper[7320]: I0312 12:21:50.582763 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"cluster-version-operator-745944c6b7-b2t49\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:50.682731 master-0 kubenswrapper[7320]: I0312 12:21:50.682561 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:21:50.685449 master-0 kubenswrapper[7320]: I0312 12:21:50.685373 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:21:50.697314 master-0 kubenswrapper[7320]: I0312 12:21:50.695446 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:21:50.697570 master-0 kubenswrapper[7320]: I0312 12:21:50.697531 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:21:50.700882 master-0 kubenswrapper[7320]: I0312 12:21:50.700827 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:21:50.737778 master-0 kubenswrapper[7320]: W0312 12:21:50.737727 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6f7496_1047_482d_9203_ff83a9eb7d93.slice/crio-d8d6e95189dd09735480728d654d01869ba9243b35fbe6a700a9456de7ccf78e WatchSource:0}: Error finding container d8d6e95189dd09735480728d654d01869ba9243b35fbe6a700a9456de7ccf78e: Status 404 returned error can't find the container with id d8d6e95189dd09735480728d654d01869ba9243b35fbe6a700a9456de7ccf78e
Mar 12 12:21:51.074935 master-0 kubenswrapper[7320]: I0312 12:21:51.072530 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:21:51.078850 master-0 kubenswrapper[7320]: I0312 12:21:51.077796 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" event={"ID":"8e6f7496-1047-482d-9203-ff83a9eb7d93","Type":"ContainerStarted","Data":"d8d6e95189dd09735480728d654d01869ba9243b35fbe6a700a9456de7ccf78e"}
Mar 12 12:21:52.283867 master-0 kubenswrapper[7320]: I0312 12:21:52.283805 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-vpss8"]
Mar 12 12:21:52.285522 master-0 kubenswrapper[7320]: I0312 12:21:52.285485 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-l8x6p"]
Mar 12 12:21:52.294455 master-0 kubenswrapper[7320]: W0312 12:21:52.294399 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3f295ac_7bc7_43b7_bd30_db82e7f16cd7.slice/crio-57810d434673756bb33355876f6921aff74c6f281e31a4bda0e7128f4df78dd1 WatchSource:0}: Error finding container 57810d434673756bb33355876f6921aff74c6f281e31a4bda0e7128f4df78dd1: Status 404 returned error can't find the container with id 57810d434673756bb33355876f6921aff74c6f281e31a4bda0e7128f4df78dd1
Mar 12 12:21:52.429359 master-0 kubenswrapper[7320]: I0312 12:21:52.429313 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:21:52.892302 master-0 kubenswrapper[7320]: I0312 12:21:52.892185 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:52.892623 master-0 kubenswrapper[7320]: E0312 12:21:52.892355 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:52.892623 master-0 kubenswrapper[7320]: E0312 12:21:52.892444 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:08.892418902 +0000 UTC m=+51.451462783 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : configmap "client-ca" not found
Mar 12 12:21:52.892955 master-0 kubenswrapper[7320]: I0312 12:21:52.892842 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") pod \"route-controller-manager-75fd5ccdf9-b6rwb\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") " pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:52.893138 master-0 kubenswrapper[7320]: E0312 12:21:52.893084 7320 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 12 12:21:52.893235 master-0 kubenswrapper[7320]: E0312 12:21:52.893208 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert podName:98d6d8ce-3a64-4c9d-956a-6325a877d871 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:08.893179294 +0000 UTC m=+51.452223215 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert") pod "route-controller-manager-75fd5ccdf9-b6rwb" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871") : secret "serving-cert" not found
Mar 12 12:21:53.088355 master-0 kubenswrapper[7320]: I0312 12:21:53.088295 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" event={"ID":"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7","Type":"ContainerStarted","Data":"57810d434673756bb33355876f6921aff74c6f281e31a4bda0e7128f4df78dd1"}
Mar 12 12:21:53.090156 master-0 kubenswrapper[7320]: I0312 12:21:53.090114 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" event={"ID":"a22189f2-3f35-4ea6-9892-39a1b46637e2","Type":"ContainerStarted","Data":"a31234df882122d07e602b31ef9412c0c41c548e3f6b29cd809ca5a3f68cae28"}
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.813591 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"]
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.815767 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-7bbc7986c6-gvnt6"]
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.817003 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.836691 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.836930 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.837046 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"]
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.837317 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.837594 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0"
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.838970 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 12 12:21:53.841407 master-0 kubenswrapper[7320]: I0312 12:21:53.839168 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 12 12:21:53.844495 master-0 kubenswrapper[7320]: I0312 12:21:53.843546 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0"
Mar 12 12:21:53.844495 master-0 kubenswrapper[7320]: I0312 12:21:53.844194 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 12 12:21:53.845170 master-0 kubenswrapper[7320]: I0312 12:21:53.844877 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 12 12:21:53.852880 master-0 kubenswrapper[7320]: I0312 12:21:53.848216 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 12 12:21:53.916918 master-0 kubenswrapper[7320]: I0312 12:21:53.916869 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.916918 master-0 kubenswrapper[7320]: I0312 12:21:53.916915 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srl2k\" (UniqueName: \"kubernetes.io/projected/bed45b99-434b-4580-a4cc-ac2f62d32268-kube-api-access-srl2k\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917135 master-0 kubenswrapper[7320]: I0312 12:21:53.916944 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-client\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917135 master-0 kubenswrapper[7320]: I0312 12:21:53.916973 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-config\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917135 master-0 kubenswrapper[7320]: I0312 12:21:53.917028 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-node-pullsecrets\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917135 master-0 kubenswrapper[7320]: I0312 12:21:53.917119 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917338 master-0 kubenswrapper[7320]: I0312 12:21:53.917139 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-encryption-config\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917338 master-0 kubenswrapper[7320]: I0312 12:21:53.917156 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-audit-dir\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917338 master-0 kubenswrapper[7320]: I0312 12:21:53.917181 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-trusted-ca-bundle\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:53.917338 master-0 kubenswrapper[7320]: I0312
12:21:53.917209 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-image-import-ca\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:53.917338 master-0 kubenswrapper[7320]: I0312 12:21:53.917312 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-serving-ca\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.018746 master-0 kubenswrapper[7320]: I0312 12:21:54.018684 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.018947 master-0 kubenswrapper[7320]: I0312 12:21:54.018756 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srl2k\" (UniqueName: \"kubernetes.io/projected/bed45b99-434b-4580-a4cc-ac2f62d32268-kube-api-access-srl2k\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019085 master-0 kubenswrapper[7320]: I0312 12:21:54.019048 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-client\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " 
pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019139 master-0 kubenswrapper[7320]: E0312 12:21:54.019063 7320 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 12 12:21:54.019139 master-0 kubenswrapper[7320]: I0312 12:21:54.019116 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-config\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019223 master-0 kubenswrapper[7320]: I0312 12:21:54.019156 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-node-pullsecrets\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019223 master-0 kubenswrapper[7320]: E0312 12:21:54.019189 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert podName:bed45b99-434b-4580-a4cc-ac2f62d32268 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:54.51915004 +0000 UTC m=+37.078193951 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert") pod "apiserver-7bbc7986c6-gvnt6" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268") : secret "serving-cert" not found Mar 12 12:21:54.019536 master-0 kubenswrapper[7320]: I0312 12:21:54.019500 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019590 master-0 kubenswrapper[7320]: I0312 12:21:54.019558 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-audit-dir\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019631 master-0 kubenswrapper[7320]: I0312 12:21:54.019595 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-encryption-config\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019671 master-0 kubenswrapper[7320]: I0312 12:21:54.019642 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-trusted-ca-bundle\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019719 master-0 kubenswrapper[7320]: I0312 12:21:54.019699 7320 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-image-import-ca\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019813 master-0 kubenswrapper[7320]: I0312 12:21:54.019789 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-serving-ca\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019924 master-0 kubenswrapper[7320]: I0312 12:21:54.019857 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-node-pullsecrets\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.019984 master-0 kubenswrapper[7320]: I0312 12:21:54.019784 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-audit-dir\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.020043 master-0 kubenswrapper[7320]: E0312 12:21:54.020023 7320 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 12 12:21:54.020083 master-0 kubenswrapper[7320]: E0312 12:21:54.020070 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit podName:bed45b99-434b-4580-a4cc-ac2f62d32268 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:21:54.520054746 +0000 UTC m=+37.079098627 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit") pod "apiserver-7bbc7986c6-gvnt6" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268") : configmap "audit-0" not found Mar 12 12:21:54.020686 master-0 kubenswrapper[7320]: I0312 12:21:54.020386 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-config\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.021491 master-0 kubenswrapper[7320]: I0312 12:21:54.021431 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-image-import-ca\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.022419 master-0 kubenswrapper[7320]: I0312 12:21:54.022365 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-serving-ca\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.022583 master-0 kubenswrapper[7320]: I0312 12:21:54.022518 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-trusted-ca-bundle\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.027726 master-0 kubenswrapper[7320]: 
I0312 12:21:54.027676 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-encryption-config\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.045640 master-0 kubenswrapper[7320]: I0312 12:21:54.045586 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-client\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.098132 master-0 kubenswrapper[7320]: I0312 12:21:54.096435 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" event={"ID":"cfd178d7-f518-413b-95ab-ab6687be6e0f","Type":"ContainerStarted","Data":"700144f71aaf4edd85bba4cceda2dd6a1013711fa671c636535cf91528c721c3"} Mar 12 12:21:54.098132 master-0 kubenswrapper[7320]: I0312 12:21:54.097859 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" event={"ID":"b9194868-75ce-4138-a9d4-ddd64660c529","Type":"ContainerStarted","Data":"4e68f4bc560c5df24d0c2f98d45fcf56d54f2e3305701c6372ad15ea940098c4"} Mar 12 12:21:54.156290 master-0 kubenswrapper[7320]: I0312 12:21:54.152240 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-7bbc7986c6-gvnt6"] Mar 12 12:21:54.221857 master-0 kubenswrapper[7320]: I0312 12:21:54.219767 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-b86747447-pj5gz"] Mar 12 12:21:54.229297 master-0 kubenswrapper[7320]: I0312 12:21:54.229245 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-srl2k\" (UniqueName: \"kubernetes.io/projected/bed45b99-434b-4580-a4cc-ac2f62d32268-kube-api-access-srl2k\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.233028 master-0 kubenswrapper[7320]: I0312 12:21:54.232980 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.238103 master-0 kubenswrapper[7320]: I0312 12:21:54.238058 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 12:21:54.240247 master-0 kubenswrapper[7320]: I0312 12:21:54.238350 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 12:21:54.240247 master-0 kubenswrapper[7320]: I0312 12:21:54.238503 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 12:21:54.240247 master-0 kubenswrapper[7320]: I0312 12:21:54.238608 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 12:21:54.240247 master-0 kubenswrapper[7320]: I0312 12:21:54.238756 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 12:21:54.240247 master-0 kubenswrapper[7320]: I0312 12:21:54.238888 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 12:21:54.240247 master-0 kubenswrapper[7320]: I0312 12:21:54.239004 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 12:21:54.240247 master-0 kubenswrapper[7320]: I0312 12:21:54.239155 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 
12 12:21:54.342584 master-0 kubenswrapper[7320]: I0312 12:21:54.342537 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-client\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.342817 master-0 kubenswrapper[7320]: I0312 12:21:54.342607 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8tpp\" (UniqueName: \"kubernetes.io/projected/a87e74f5-3187-44b7-8125-2349647dac7c-kube-api-access-s8tpp\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.342817 master-0 kubenswrapper[7320]: I0312 12:21:54.342686 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-trusted-ca-bundle\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.342817 master-0 kubenswrapper[7320]: I0312 12:21:54.342712 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-encryption-config\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.342817 master-0 kubenswrapper[7320]: I0312 12:21:54.342770 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-serving-ca\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.342817 master-0 kubenswrapper[7320]: I0312 12:21:54.342793 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a87e74f5-3187-44b7-8125-2349647dac7c-audit-dir\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.343022 master-0 kubenswrapper[7320]: I0312 12:21:54.342831 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-audit-policies\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.343022 master-0 kubenswrapper[7320]: I0312 12:21:54.342852 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-serving-cert\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.346986 master-0 kubenswrapper[7320]: I0312 12:21:54.346946 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-b86747447-pj5gz"] Mar 12 12:21:54.443607 master-0 kubenswrapper[7320]: I0312 12:21:54.443540 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-trusted-ca-bundle\") pod \"apiserver-b86747447-pj5gz\" 
(UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.443810 master-0 kubenswrapper[7320]: I0312 12:21:54.443616 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-encryption-config\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.443810 master-0 kubenswrapper[7320]: I0312 12:21:54.443710 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-serving-ca\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.443810 master-0 kubenswrapper[7320]: I0312 12:21:54.443763 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a87e74f5-3187-44b7-8125-2349647dac7c-audit-dir\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.443908 master-0 kubenswrapper[7320]: I0312 12:21:54.443801 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-audit-policies\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.443908 master-0 kubenswrapper[7320]: I0312 12:21:54.443846 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-serving-cert\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.443968 master-0 kubenswrapper[7320]: I0312 12:21:54.443911 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-client\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.444001 master-0 kubenswrapper[7320]: I0312 12:21:54.443958 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8tpp\" (UniqueName: \"kubernetes.io/projected/a87e74f5-3187-44b7-8125-2349647dac7c-kube-api-access-s8tpp\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.444167 master-0 kubenswrapper[7320]: I0312 12:21:54.444137 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a87e74f5-3187-44b7-8125-2349647dac7c-audit-dir\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.447540 master-0 kubenswrapper[7320]: I0312 12:21:54.445164 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-trusted-ca-bundle\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.447540 master-0 kubenswrapper[7320]: I0312 12:21:54.445670 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-audit-policies\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.448292 master-0 kubenswrapper[7320]: I0312 12:21:54.448232 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-client\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.448381 master-0 kubenswrapper[7320]: I0312 12:21:54.448362 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-serving-ca\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.451495 master-0 kubenswrapper[7320]: I0312 12:21:54.449896 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-serving-cert\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.451795 master-0 kubenswrapper[7320]: I0312 12:21:54.451778 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-encryption-config\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" Mar 12 12:21:54.545109 master-0 kubenswrapper[7320]: I0312 12:21:54.545065 7320 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.545507 master-0 kubenswrapper[7320]: I0312 12:21:54.545465 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:54.555684 master-0 kubenswrapper[7320]: E0312 12:21:54.555628 7320 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 12 12:21:54.555874 master-0 kubenswrapper[7320]: E0312 12:21:54.555734 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit podName:bed45b99-434b-4580-a4cc-ac2f62d32268 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:55.555711284 +0000 UTC m=+38.114755175 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit") pod "apiserver-7bbc7986c6-gvnt6" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268") : configmap "audit-0" not found Mar 12 12:21:54.555951 master-0 kubenswrapper[7320]: E0312 12:21:54.555900 7320 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 12 12:21:54.556035 master-0 kubenswrapper[7320]: E0312 12:21:54.556006 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert podName:bed45b99-434b-4580-a4cc-ac2f62d32268 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:21:55.555978731 +0000 UTC m=+38.115022662 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert") pod "apiserver-7bbc7986c6-gvnt6" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268") : secret "serving-cert" not found
Mar 12 12:21:54.671276 master-0 kubenswrapper[7320]: I0312 12:21:54.671207 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8tpp\" (UniqueName: \"kubernetes.io/projected/a87e74f5-3187-44b7-8125-2349647dac7c-kube-api-access-s8tpp\") pod \"apiserver-b86747447-pj5gz\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") " pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz"
Mar 12 12:21:54.856109 master-0 kubenswrapper[7320]: I0312 12:21:54.855980 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz"
Mar 12 12:21:55.221852 master-0 kubenswrapper[7320]: I0312 12:21:55.221781 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-b86747447-pj5gz"]
Mar 12 12:21:55.568157 master-0 kubenswrapper[7320]: I0312 12:21:55.567657 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:55.568157 master-0 kubenswrapper[7320]: I0312 12:21:55.568066 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:55.568526 master-0 kubenswrapper[7320]: E0312 12:21:55.568205 7320 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 12 12:21:55.568526 master-0 kubenswrapper[7320]: E0312 12:21:55.568269 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit podName:bed45b99-434b-4580-a4cc-ac2f62d32268 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:57.568251726 +0000 UTC m=+40.127295607 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit") pod "apiserver-7bbc7986c6-gvnt6" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268") : configmap "audit-0" not found
Mar 12 12:21:55.577671 master-0 kubenswrapper[7320]: I0312 12:21:55.574696 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:55.588093 master-0 kubenswrapper[7320]: I0312 12:21:55.588044 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:21:55.670416 master-0 kubenswrapper[7320]: I0312 12:21:55.669310 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65bb95677d-rqwcb"]
Mar 12 12:21:55.670416 master-0 kubenswrapper[7320]: E0312 12:21:55.669634 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb" podUID="33619450-d04b-4cb7-999b-2b3631fc3e91"
Mar 12 12:21:55.670416 master-0 kubenswrapper[7320]: I0312 12:21:55.669729 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") pod \"controller-manager-65bb95677d-rqwcb\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") " pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:55.670416 master-0 kubenswrapper[7320]: E0312 12:21:55.669845 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:55.670416 master-0 kubenswrapper[7320]: E0312 12:21:55.669892 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca podName:33619450-d04b-4cb7-999b-2b3631fc3e91 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:11.669877354 +0000 UTC m=+54.228921235 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca") pod "controller-manager-65bb95677d-rqwcb" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91") : configmap "client-ca" not found
Mar 12 12:21:55.671380 master-0 kubenswrapper[7320]: I0312 12:21:55.671160 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"]
Mar 12 12:21:55.671946 master-0 kubenswrapper[7320]: E0312 12:21:55.671828 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb" podUID="98d6d8ce-3a64-4c9d-956a-6325a877d871"
Mar 12 12:21:56.073733 master-0 kubenswrapper[7320]: I0312 12:21:56.073676 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-b86747447-pj5gz"]
Mar 12 12:21:56.112471 master-0 kubenswrapper[7320]: I0312 12:21:56.112280 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" event={"ID":"a87e74f5-3187-44b7-8125-2349647dac7c","Type":"ContainerStarted","Data":"47b3552265d71c98eea3d38ead40654ad4166a125299885e5f9f87f8e67a66bc"}
Mar 12 12:21:56.112471 master-0 kubenswrapper[7320]: I0312 12:21:56.112326 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:56.112471 master-0 kubenswrapper[7320]: I0312 12:21:56.112305 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:56.124801 master-0 kubenswrapper[7320]: I0312 12:21:56.123087 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:56.133055 master-0 kubenswrapper[7320]: I0312 12:21:56.133015 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:56.174768 master-0 kubenswrapper[7320]: I0312 12:21:56.174712 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") pod \"33619450-d04b-4cb7-999b-2b3631fc3e91\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") "
Mar 12 12:21:56.174970 master-0 kubenswrapper[7320]: I0312 12:21:56.174787 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-config\") pod \"98d6d8ce-3a64-4c9d-956a-6325a877d871\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") "
Mar 12 12:21:56.174970 master-0 kubenswrapper[7320]: I0312 12:21:56.174828 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-proxy-ca-bundles\") pod \"33619450-d04b-4cb7-999b-2b3631fc3e91\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") "
Mar 12 12:21:56.174970 master-0 kubenswrapper[7320]: I0312 12:21:56.174851 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnb5q\" (UniqueName: \"kubernetes.io/projected/33619450-d04b-4cb7-999b-2b3631fc3e91-kube-api-access-dnb5q\") pod \"33619450-d04b-4cb7-999b-2b3631fc3e91\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") "
Mar 12 12:21:56.174970 master-0 kubenswrapper[7320]: I0312 12:21:56.174883 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wll8q\" (UniqueName: \"kubernetes.io/projected/98d6d8ce-3a64-4c9d-956a-6325a877d871-kube-api-access-wll8q\") pod \"98d6d8ce-3a64-4c9d-956a-6325a877d871\" (UID: \"98d6d8ce-3a64-4c9d-956a-6325a877d871\") "
Mar 12 12:21:56.174970 master-0 kubenswrapper[7320]: I0312 12:21:56.174920 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-config\") pod \"33619450-d04b-4cb7-999b-2b3631fc3e91\" (UID: \"33619450-d04b-4cb7-999b-2b3631fc3e91\") "
Mar 12 12:21:56.175444 master-0 kubenswrapper[7320]: I0312 12:21:56.175397 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "33619450-d04b-4cb7-999b-2b3631fc3e91" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:56.176175 master-0 kubenswrapper[7320]: I0312 12:21:56.176060 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-config" (OuterVolumeSpecName: "config") pod "33619450-d04b-4cb7-999b-2b3631fc3e91" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:56.176242 master-0 kubenswrapper[7320]: I0312 12:21:56.176225 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-config" (OuterVolumeSpecName: "config") pod "98d6d8ce-3a64-4c9d-956a-6325a877d871" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:56.198741 master-0 kubenswrapper[7320]: I0312 12:21:56.198660 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33619450-d04b-4cb7-999b-2b3631fc3e91" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:21:56.198741 master-0 kubenswrapper[7320]: I0312 12:21:56.198700 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98d6d8ce-3a64-4c9d-956a-6325a877d871-kube-api-access-wll8q" (OuterVolumeSpecName: "kube-api-access-wll8q") pod "98d6d8ce-3a64-4c9d-956a-6325a877d871" (UID: "98d6d8ce-3a64-4c9d-956a-6325a877d871"). InnerVolumeSpecName "kube-api-access-wll8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:21:56.198741 master-0 kubenswrapper[7320]: I0312 12:21:56.198669 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33619450-d04b-4cb7-999b-2b3631fc3e91-kube-api-access-dnb5q" (OuterVolumeSpecName: "kube-api-access-dnb5q") pod "33619450-d04b-4cb7-999b-2b3631fc3e91" (UID: "33619450-d04b-4cb7-999b-2b3631fc3e91"). InnerVolumeSpecName "kube-api-access-dnb5q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:21:56.236210 master-0 kubenswrapper[7320]: I0312 12:21:56.236077 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-7bbc7986c6-gvnt6"]
Mar 12 12:21:56.236416 master-0 kubenswrapper[7320]: E0312 12:21:56.236362 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" podUID="bed45b99-434b-4580-a4cc-ac2f62d32268"
Mar 12 12:21:56.281570 master-0 kubenswrapper[7320]: I0312 12:21:56.281197 7320 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33619450-d04b-4cb7-999b-2b3631fc3e91-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:56.281570 master-0 kubenswrapper[7320]: I0312 12:21:56.281231 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:56.281570 master-0 kubenswrapper[7320]: I0312 12:21:56.281242 7320 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:56.281570 master-0 kubenswrapper[7320]: I0312 12:21:56.281250 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnb5q\" (UniqueName: \"kubernetes.io/projected/33619450-d04b-4cb7-999b-2b3631fc3e91-kube-api-access-dnb5q\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:56.281570 master-0 kubenswrapper[7320]: I0312 12:21:56.281261 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wll8q\" (UniqueName: \"kubernetes.io/projected/98d6d8ce-3a64-4c9d-956a-6325a877d871-kube-api-access-wll8q\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:56.281570 master-0 kubenswrapper[7320]: I0312 12:21:56.281270 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:56.357637 master-0 kubenswrapper[7320]: I0312 12:21:56.357512 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 12 12:21:56.357943 master-0 kubenswrapper[7320]: I0312 12:21:56.357889 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerName="installer" containerID="cri-o://b315bdbdd00e7e78faaebae53e6e5aca4dcfbe013781ad0113a093ac0097dc1b" gracePeriod=30
Mar 12 12:21:57.117965 master-0 kubenswrapper[7320]: I0312 12:21:57.117899 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-65bb95677d-rqwcb"
Mar 12 12:21:57.117965 master-0 kubenswrapper[7320]: I0312 12:21:57.117951 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:57.119666 master-0 kubenswrapper[7320]: I0312 12:21:57.117910 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"
Mar 12 12:21:57.126151 master-0 kubenswrapper[7320]: I0312 12:21:57.126102 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:57.155172 master-0 kubenswrapper[7320]: I0312 12:21:57.154662 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"]
Mar 12 12:21:57.167763 master-0 kubenswrapper[7320]: I0312 12:21:57.166468 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.172298 master-0 kubenswrapper[7320]: I0312 12:21:57.171837 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 12 12:21:57.172298 master-0 kubenswrapper[7320]: I0312 12:21:57.171968 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 12 12:21:57.172718 master-0 kubenswrapper[7320]: I0312 12:21:57.172666 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 12 12:21:57.172815 master-0 kubenswrapper[7320]: I0312 12:21:57.172780 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:21:57.172937 master-0 kubenswrapper[7320]: I0312 12:21:57.172913 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 12 12:21:57.175523 master-0 kubenswrapper[7320]: I0312 12:21:57.175468 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"]
Mar 12 12:21:57.177111 master-0 kubenswrapper[7320]: I0312 12:21:57.176490 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75fd5ccdf9-b6rwb"]
Mar 12 12:21:57.177646 master-0 kubenswrapper[7320]: I0312 12:21:57.177616 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"]
Mar 12 12:21:57.185533 master-0 kubenswrapper[7320]: I0312 12:21:57.185108 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-65bb95677d-rqwcb"]
Mar 12 12:21:57.191811 master-0 kubenswrapper[7320]: I0312 12:21:57.191398 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.191811 master-0 kubenswrapper[7320]: I0312 12:21:57.191438 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-trusted-ca-bundle\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.191811 master-0 kubenswrapper[7320]: I0312 12:21:57.191496 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-client\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.191811 master-0 kubenswrapper[7320]: I0312 12:21:57.191714 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-audit-dir\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.191811 master-0 kubenswrapper[7320]: I0312 12:21:57.191760 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-config\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.191811 master-0 kubenswrapper[7320]: I0312 12:21:57.191793 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srl2k\" (UniqueName: \"kubernetes.io/projected/bed45b99-434b-4580-a4cc-ac2f62d32268-kube-api-access-srl2k\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.192089 master-0 kubenswrapper[7320]: I0312 12:21:57.191833 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-serving-ca\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.192089 master-0 kubenswrapper[7320]: I0312 12:21:57.191859 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-node-pullsecrets\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.192089 master-0 kubenswrapper[7320]: I0312 12:21:57.191887 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-image-import-ca\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.192089 master-0 kubenswrapper[7320]: I0312 12:21:57.191825 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:21:57.192089 master-0 kubenswrapper[7320]: I0312 12:21:57.191938 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-encryption-config\") pod \"bed45b99-434b-4580-a4cc-ac2f62d32268\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") "
Mar 12 12:21:57.192089 master-0 kubenswrapper[7320]: I0312 12:21:57.191984 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:21:57.192296 master-0 kubenswrapper[7320]: I0312 12:21:57.192269 7320 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-node-pullsecrets\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.192296 master-0 kubenswrapper[7320]: I0312 12:21:57.192289 7320 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bed45b99-434b-4580-a4cc-ac2f62d32268-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.192558 master-0 kubenswrapper[7320]: I0312 12:21:57.192533 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:57.193156 master-0 kubenswrapper[7320]: I0312 12:21:57.193016 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:57.193301 master-0 kubenswrapper[7320]: I0312 12:21:57.193266 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-65bb95677d-rqwcb"]
Mar 12 12:21:57.193363 master-0 kubenswrapper[7320]: I0312 12:21:57.193332 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:57.193560 master-0 kubenswrapper[7320]: I0312 12:21:57.193528 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-config" (OuterVolumeSpecName: "config") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:21:57.195135 master-0 kubenswrapper[7320]: I0312 12:21:57.195098 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:21:57.195506 master-0 kubenswrapper[7320]: I0312 12:21:57.195459 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed45b99-434b-4580-a4cc-ac2f62d32268-kube-api-access-srl2k" (OuterVolumeSpecName: "kube-api-access-srl2k") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "kube-api-access-srl2k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:21:57.196650 master-0 kubenswrapper[7320]: I0312 12:21:57.196621 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:21:57.197231 master-0 kubenswrapper[7320]: I0312 12:21:57.197200 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "bed45b99-434b-4580-a4cc-ac2f62d32268" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:21:57.293869 master-0 kubenswrapper[7320]: I0312 12:21:57.293803 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45da9d2-861a-497c-b237-290c70e65ef3-serving-cert\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.293869 master-0 kubenswrapper[7320]: I0312 12:21:57.293866 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.294082 master-0 kubenswrapper[7320]: I0312 12:21:57.293919 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7qnt\" (UniqueName: \"kubernetes.io/projected/d45da9d2-861a-497c-b237-290c70e65ef3-kube-api-access-g7qnt\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.294082 master-0 kubenswrapper[7320]: I0312 12:21:57.293962 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-config\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.294082 master-0 kubenswrapper[7320]: I0312 12:21:57.294039 7320 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98d6d8ce-3a64-4c9d-956a-6325a877d871-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294082 master-0 kubenswrapper[7320]: I0312 12:21:57.294082 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294095 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srl2k\" (UniqueName: \"kubernetes.io/projected/bed45b99-434b-4580-a4cc-ac2f62d32268-kube-api-access-srl2k\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294107 7320 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98d6d8ce-3a64-4c9d-956a-6325a877d871-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294134 7320 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-serving-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294151 7320 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-image-import-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294163 7320 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33619450-d04b-4cb7-999b-2b3631fc3e91-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294176 7320 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-encryption-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294187 7320 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294197 master-0 kubenswrapper[7320]: I0312 12:21:57.294198 7320 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.294443 master-0 kubenswrapper[7320]: I0312 12:21:57.294210 7320 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bed45b99-434b-4580-a4cc-ac2f62d32268-etcd-client\") on node \"master-0\" DevicePath \"\""
Mar 12 12:21:57.395542 master-0 kubenswrapper[7320]: I0312 12:21:57.395416 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45da9d2-861a-497c-b237-290c70e65ef3-serving-cert\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.395542 master-0 kubenswrapper[7320]: I0312 12:21:57.395466 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.395542 master-0 kubenswrapper[7320]: I0312 12:21:57.395505 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7qnt\" (UniqueName: \"kubernetes.io/projected/d45da9d2-861a-497c-b237-290c70e65ef3-kube-api-access-g7qnt\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.395761 master-0 kubenswrapper[7320]: E0312 12:21:57.395659 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:57.395761 master-0 kubenswrapper[7320]: E0312 12:21:57.395719 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca podName:d45da9d2-861a-497c-b237-290c70e65ef3 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:57.895700408 +0000 UTC m=+40.454744289 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca") pod "route-controller-manager-9bbcffc94-dpvxr" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3") : configmap "client-ca" not found
Mar 12 12:21:57.395761 master-0 kubenswrapper[7320]: I0312 12:21:57.395755 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-config\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.397184 master-0 kubenswrapper[7320]: I0312 12:21:57.397150 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-config\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.411891 master-0 kubenswrapper[7320]: I0312 12:21:57.411824 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7qnt\" (UniqueName: \"kubernetes.io/projected/d45da9d2-861a-497c-b237-290c70e65ef3-kube-api-access-g7qnt\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.412734 master-0 kubenswrapper[7320]: I0312 12:21:57.412690 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45da9d2-861a-497c-b237-290c70e65ef3-serving-cert\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.598106 master-0 kubenswrapper[7320]: I0312 12:21:57.598016 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit\") pod \"apiserver-7bbc7986c6-gvnt6\" (UID: \"bed45b99-434b-4580-a4cc-ac2f62d32268\") " pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6"
Mar 12 12:21:57.598310 master-0 kubenswrapper[7320]: E0312 12:21:57.598167 7320 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 12 12:21:57.598310 master-0 kubenswrapper[7320]: E0312 12:21:57.598246 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit podName:bed45b99-434b-4580-a4cc-ac2f62d32268 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:01.598228573 +0000 UTC m=+44.157272454 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit") pod "apiserver-7bbc7986c6-gvnt6" (UID: "bed45b99-434b-4580-a4cc-ac2f62d32268") : configmap "audit-0" not found
Mar 12 12:21:57.762436 master-0 kubenswrapper[7320]: I0312 12:21:57.762062 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33619450-d04b-4cb7-999b-2b3631fc3e91" path="/var/lib/kubelet/pods/33619450-d04b-4cb7-999b-2b3631fc3e91/volumes"
Mar 12 12:21:57.762914 master-0 kubenswrapper[7320]: I0312 12:21:57.762878 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98d6d8ce-3a64-4c9d-956a-6325a877d871" path="/var/lib/kubelet/pods/98d6d8ce-3a64-4c9d-956a-6325a877d871/volumes"
Mar 12 12:21:57.873443 master-0 kubenswrapper[7320]: I0312 12:21:57.873394 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"]
Mar 12 12:21:57.873971 master-0 kubenswrapper[7320]: I0312 12:21:57.873950 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 12 12:21:57.875814 master-0 kubenswrapper[7320]: I0312 12:21:57.875701 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt"
Mar 12 12:21:57.878924 master-0 kubenswrapper[7320]: I0312 12:21:57.878892 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"]
Mar 12 12:21:57.900759 master-0 kubenswrapper[7320]: I0312 12:21:57.900713 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:21:57.900864 master-0 kubenswrapper[7320]: E0312 12:21:57.900823 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:21:57.901026 master-0 kubenswrapper[7320]: E0312 12:21:57.900948 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca podName:d45da9d2-861a-497c-b237-290c70e65ef3 nodeName:}" failed. No retries permitted until 2026-03-12 12:21:58.900928413 +0000 UTC m=+41.459972294 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca") pod "route-controller-manager-9bbcffc94-dpvxr" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3") : configmap "client-ca" not found Mar 12 12:21:58.001901 master-0 kubenswrapper[7320]: I0312 12:21:58.001842 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-var-lock\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.001901 master-0 kubenswrapper[7320]: I0312 12:21:58.001896 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kube-api-access\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.002157 master-0 kubenswrapper[7320]: I0312 12:21:58.001976 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.103416 master-0 kubenswrapper[7320]: I0312 12:21:58.103293 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-var-lock\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.103416 master-0 kubenswrapper[7320]: I0312 12:21:58.103344 7320 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kube-api-access\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.103677 master-0 kubenswrapper[7320]: I0312 12:21:58.103431 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-var-lock\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.103677 master-0 kubenswrapper[7320]: I0312 12:21:58.103541 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.103677 master-0 kubenswrapper[7320]: I0312 12:21:58.103654 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.120128 master-0 kubenswrapper[7320]: I0312 12:21:58.120101 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-7bbc7986c6-gvnt6" Mar 12 12:21:58.584184 master-0 kubenswrapper[7320]: I0312 12:21:58.584132 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kube-api-access\") pod \"installer-1-master-0\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.599442 master-0 kubenswrapper[7320]: I0312 12:21:58.598440 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-7bbc7986c6-gvnt6"] Mar 12 12:21:58.606355 master-0 kubenswrapper[7320]: I0312 12:21:58.600726 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-7bbc7986c6-gvnt6"] Mar 12 12:21:58.711392 master-0 kubenswrapper[7320]: I0312 12:21:58.711329 7320 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bed45b99-434b-4580-a4cc-ac2f62d32268-audit\") on node \"master-0\" DevicePath \"\"" Mar 12 12:21:58.763748 master-0 kubenswrapper[7320]: I0312 12:21:58.763695 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 12 12:21:58.764309 master-0 kubenswrapper[7320]: I0312 12:21:58.764291 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.770238 master-0 kubenswrapper[7320]: I0312 12:21:58.770200 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 12 12:21:58.798545 master-0 kubenswrapper[7320]: I0312 12:21:58.798513 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 12 12:21:58.812802 master-0 kubenswrapper[7320]: I0312 12:21:58.812771 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-var-lock\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.812939 master-0 kubenswrapper[7320]: I0312 12:21:58.812915 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc32576-d801-439b-aed4-40214599f785-kube-api-access\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.813178 master-0 kubenswrapper[7320]: I0312 12:21:58.813112 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.914351 master-0 kubenswrapper[7320]: I0312 12:21:58.914225 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.914561 master-0 kubenswrapper[7320]: I0312 12:21:58.914403 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-kubelet-dir\") pod \"installer-2-master-0\" (UID: 
\"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.914561 master-0 kubenswrapper[7320]: I0312 12:21:58.914536 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-var-lock\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.914640 master-0 kubenswrapper[7320]: I0312 12:21:58.914618 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr" Mar 12 12:21:58.914760 master-0 kubenswrapper[7320]: I0312 12:21:58.914711 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-var-lock\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.914819 master-0 kubenswrapper[7320]: I0312 12:21:58.914791 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc32576-d801-439b-aed4-40214599f785-kube-api-access\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:58.914916 master-0 kubenswrapper[7320]: E0312 12:21:58.914843 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:21:58.914916 master-0 kubenswrapper[7320]: E0312 12:21:58.914902 7320 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca podName:d45da9d2-861a-497c-b237-290c70e65ef3 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:00.914881337 +0000 UTC m=+43.473925298 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca") pod "route-controller-manager-9bbcffc94-dpvxr" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3") : configmap "client-ca" not found Mar 12 12:21:58.930763 master-0 kubenswrapper[7320]: I0312 12:21:58.930733 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc32576-d801-439b-aed4-40214599f785-kube-api-access\") pod \"installer-2-master-0\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:59.092009 master-0 kubenswrapper[7320]: I0312 12:21:59.091950 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 12 12:21:59.764917 master-0 kubenswrapper[7320]: I0312 12:21:59.764859 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed45b99-434b-4580-a4cc-ac2f62d32268" path="/var/lib/kubelet/pods/bed45b99-434b-4580-a4cc-ac2f62d32268/volumes" Mar 12 12:22:00.083444 master-0 kubenswrapper[7320]: I0312 12:22:00.082673 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8"] Mar 12 12:22:00.083444 master-0 kubenswrapper[7320]: I0312 12:22:00.083444 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.087974 master-0 kubenswrapper[7320]: I0312 12:22:00.087932 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 12:22:00.088154 master-0 kubenswrapper[7320]: I0312 12:22:00.088116 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 12:22:00.088263 master-0 kubenswrapper[7320]: I0312 12:22:00.088246 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 12:22:00.088321 master-0 kubenswrapper[7320]: I0312 12:22:00.088259 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 12:22:00.089299 master-0 kubenswrapper[7320]: I0312 12:22:00.088799 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 12:22:00.090391 master-0 kubenswrapper[7320]: I0312 12:22:00.090345 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-7849849f76-86f2r"] Mar 12 12:22:00.091506 master-0 kubenswrapper[7320]: I0312 12:22:00.091461 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.093662 master-0 kubenswrapper[7320]: I0312 12:22:00.093129 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 12:22:00.096459 master-0 kubenswrapper[7320]: I0312 12:22:00.096432 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 12:22:00.096802 master-0 kubenswrapper[7320]: I0312 12:22:00.096779 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 12:22:00.098539 master-0 kubenswrapper[7320]: I0312 12:22:00.098514 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 12:22:00.098743 master-0 kubenswrapper[7320]: I0312 12:22:00.098722 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 12:22:00.098951 master-0 kubenswrapper[7320]: I0312 12:22:00.098886 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 12:22:00.099364 master-0 kubenswrapper[7320]: I0312 12:22:00.099311 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 12:22:00.099771 master-0 kubenswrapper[7320]: I0312 12:22:00.099744 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 12:22:00.100568 master-0 kubenswrapper[7320]: I0312 12:22:00.100538 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 12:22:00.101586 master-0 kubenswrapper[7320]: I0312 12:22:00.101541 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 12:22:00.101822 master-0 kubenswrapper[7320]: I0312 12:22:00.101787 
7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8"] Mar 12 12:22:00.103452 master-0 kubenswrapper[7320]: I0312 12:22:00.103406 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-7849849f76-86f2r"] Mar 12 12:22:00.104521 master-0 kubenswrapper[7320]: I0312 12:22:00.104462 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 12:22:00.126963 master-0 kubenswrapper[7320]: I0312 12:22:00.126918 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbdcb\" (UniqueName: \"kubernetes.io/projected/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-kube-api-access-vbdcb\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.127113 master-0 kubenswrapper[7320]: I0312 12:22:00.126973 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit-dir\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127113 master-0 kubenswrapper[7320]: I0312 12:22:00.127096 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-node-pullsecrets\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127176 master-0 kubenswrapper[7320]: I0312 12:22:00.127148 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-serving-cert\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127213 master-0 kubenswrapper[7320]: I0312 12:22:00.127183 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgxbv\" (UniqueName: \"kubernetes.io/projected/c873b656-d2aa-4d0e-aa22-9f8d35186473-kube-api-access-kgxbv\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127245 master-0 kubenswrapper[7320]: I0312 12:22:00.127218 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-encryption-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127277 master-0 kubenswrapper[7320]: I0312 12:22:00.127256 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-client\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127317 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-trusted-ca-bundle\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127899 master-0 
kubenswrapper[7320]: I0312 12:22:00.127437 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127491 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-proxy-ca-bundles\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127524 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127559 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-serving-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127683 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-config\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: 
\"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127720 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127744 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-serving-cert\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.127899 master-0 kubenswrapper[7320]: I0312 12:22:00.127793 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-image-import-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.229825 master-0 kubenswrapper[7320]: I0312 12:22:00.229549 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxbv\" (UniqueName: \"kubernetes.io/projected/c873b656-d2aa-4d0e-aa22-9f8d35186473-kube-api-access-kgxbv\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230017 master-0 kubenswrapper[7320]: I0312 12:22:00.229869 7320 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-encryption-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230168 master-0 kubenswrapper[7320]: I0312 12:22:00.230124 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-client\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230219 master-0 kubenswrapper[7320]: I0312 12:22:00.230202 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-trusted-ca-bundle\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230277 master-0 kubenswrapper[7320]: I0312 12:22:00.230245 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230277 master-0 kubenswrapper[7320]: I0312 12:22:00.230271 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-proxy-ca-bundles\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.230343 master-0 kubenswrapper[7320]: I0312 12:22:00.230303 7320 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230343 master-0 kubenswrapper[7320]: I0312 12:22:00.230336 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-serving-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230414 master-0 kubenswrapper[7320]: I0312 12:22:00.230382 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-config\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.230453 master-0 kubenswrapper[7320]: I0312 12:22:00.230416 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.230453 master-0 kubenswrapper[7320]: I0312 12:22:00.230446 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-serving-cert\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 
12:22:00.230547 master-0 kubenswrapper[7320]: I0312 12:22:00.230469 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-image-import-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.230935 master-0 kubenswrapper[7320]: I0312 12:22:00.230604 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbdcb\" (UniqueName: \"kubernetes.io/projected/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-kube-api-access-vbdcb\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.232111 master-0 kubenswrapper[7320]: E0312 12:22:00.232087 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: E0312 12:22:00.232149 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca podName:624d2bff-bb9a-45ea-ac5b-00bb55d758d7 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:00.732128637 +0000 UTC m=+43.291172528 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca") pod "controller-manager-5fc9f9dcb7-8f5f8" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7") : configmap "client-ca" not found Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: I0312 12:22:00.232277 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-config\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: I0312 12:22:00.232366 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-trusted-ca-bundle\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: I0312 12:22:00.232412 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit-dir\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: I0312 12:22:00.232559 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-node-pullsecrets\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: I0312 12:22:00.232677 7320 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-serving-cert\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: I0312 12:22:00.232964 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-image-import-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.233870 master-0 kubenswrapper[7320]: I0312 12:22:00.233651 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.234619 master-0 kubenswrapper[7320]: I0312 12:22:00.234566 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-encryption-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.234719 master-0 kubenswrapper[7320]: I0312 12:22:00.234598 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-client\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.234719 master-0 kubenswrapper[7320]: I0312 12:22:00.234691 7320 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit-dir\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.234802 master-0 kubenswrapper[7320]: I0312 12:22:00.234771 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-node-pullsecrets\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.234962 master-0 kubenswrapper[7320]: I0312 12:22:00.234922 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-proxy-ca-bundles\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.235116 master-0 kubenswrapper[7320]: I0312 12:22:00.235082 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.236135 master-0 kubenswrapper[7320]: I0312 12:22:00.236072 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-serving-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.240609 master-0 kubenswrapper[7320]: I0312 12:22:00.240556 7320 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-serving-cert\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.246052 master-0 kubenswrapper[7320]: I0312 12:22:00.245882 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxbv\" (UniqueName: \"kubernetes.io/projected/c873b656-d2aa-4d0e-aa22-9f8d35186473-kube-api-access-kgxbv\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.248724 master-0 kubenswrapper[7320]: I0312 12:22:00.248686 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-serving-cert\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.250079 master-0 kubenswrapper[7320]: I0312 12:22:00.250055 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbdcb\" (UniqueName: \"kubernetes.io/projected/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-kube-api-access-vbdcb\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.432871 master-0 kubenswrapper[7320]: I0312 12:22:00.432748 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:00.737720 master-0 kubenswrapper[7320]: I0312 12:22:00.737667 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:00.737899 master-0 kubenswrapper[7320]: E0312 12:22:00.737843 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:22:00.737954 master-0 kubenswrapper[7320]: E0312 12:22:00.737916 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca podName:624d2bff-bb9a-45ea-ac5b-00bb55d758d7 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:01.737899078 +0000 UTC m=+44.296942959 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca") pod "controller-manager-5fc9f9dcb7-8f5f8" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7") : configmap "client-ca" not found Mar 12 12:22:00.941340 master-0 kubenswrapper[7320]: I0312 12:22:00.941284 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr" Mar 12 12:22:00.941836 master-0 kubenswrapper[7320]: E0312 12:22:00.941461 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:22:00.941836 master-0 kubenswrapper[7320]: E0312 12:22:00.941570 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca podName:d45da9d2-861a-497c-b237-290c70e65ef3 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:04.941546846 +0000 UTC m=+47.500590757 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca") pod "route-controller-manager-9bbcffc94-dpvxr" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3") : configmap "client-ca" not found Mar 12 12:22:01.520589 master-0 kubenswrapper[7320]: I0312 12:22:01.520137 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 12 12:22:01.767983 master-0 kubenswrapper[7320]: I0312 12:22:01.765078 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:01.767983 master-0 kubenswrapper[7320]: E0312 12:22:01.765294 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:22:01.767983 master-0 kubenswrapper[7320]: E0312 12:22:01.765351 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca podName:624d2bff-bb9a-45ea-ac5b-00bb55d758d7 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:03.765333697 +0000 UTC m=+46.324377588 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca") pod "controller-manager-5fc9f9dcb7-8f5f8" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7") : configmap "client-ca" not found Mar 12 12:22:01.787633 master-0 kubenswrapper[7320]: I0312 12:22:01.787557 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 12 12:22:01.788426 master-0 kubenswrapper[7320]: I0312 12:22:01.788385 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-7849849f76-86f2r"] Mar 12 12:22:01.851533 master-0 kubenswrapper[7320]: I0312 12:22:01.847547 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-9zrvj"] Mar 12 12:22:01.851533 master-0 kubenswrapper[7320]: I0312 12:22:01.848182 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.967467 master-0 kubenswrapper[7320]: I0312 12:22:01.967173 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.967467 master-0 kubenswrapper[7320]: I0312 12:22:01.967463 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-conf\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967519 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-kubernetes\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967536 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-var-lib-kubelet\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967588 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-run\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967616 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysconfig\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967645 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-systemd\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967708 7320 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-lib-modules\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967778 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-tmp\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967820 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhwbb\" (UniqueName: \"kubernetes.io/projected/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-kube-api-access-qhwbb\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967836 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-tuned\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967886 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-host\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967916 
7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-modprobe-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:01.968512 master-0 kubenswrapper[7320]: I0312 12:22:01.967936 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-sys\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.071987 master-0 kubenswrapper[7320]: I0312 12:22:02.071943 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-systemd\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072148 master-0 kubenswrapper[7320]: I0312 12:22:02.072003 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-lib-modules\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072148 master-0 kubenswrapper[7320]: I0312 12:22:02.072026 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-tmp\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072208 master-0 kubenswrapper[7320]: I0312 12:22:02.072138 7320 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-systemd\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072276 master-0 kubenswrapper[7320]: I0312 12:22:02.072236 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhwbb\" (UniqueName: \"kubernetes.io/projected/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-kube-api-access-qhwbb\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072348 master-0 kubenswrapper[7320]: I0312 12:22:02.072321 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-tuned\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072409 master-0 kubenswrapper[7320]: I0312 12:22:02.072391 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-host\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072455 master-0 kubenswrapper[7320]: I0312 12:22:02.072436 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-modprobe-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072510 master-0 kubenswrapper[7320]: I0312 12:22:02.072463 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-sys\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072510 master-0 kubenswrapper[7320]: I0312 12:22:02.072505 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072572 master-0 kubenswrapper[7320]: I0312 12:22:02.072529 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-conf\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072572 master-0 kubenswrapper[7320]: I0312 12:22:02.072566 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-kubernetes\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072631 master-0 kubenswrapper[7320]: I0312 12:22:02.072589 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-var-lib-kubelet\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072710 master-0 kubenswrapper[7320]: I0312 12:22:02.072683 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-run\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072762 master-0 kubenswrapper[7320]: I0312 12:22:02.072741 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysconfig\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.072924 master-0 kubenswrapper[7320]: I0312 12:22:02.072896 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysconfig\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.073826 master-0 kubenswrapper[7320]: I0312 12:22:02.073753 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.073875 master-0 kubenswrapper[7320]: I0312 12:22:02.073832 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-conf\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.074496 master-0 kubenswrapper[7320]: I0312 12:22:02.073915 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-var-lib-kubelet\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.074496 master-0 kubenswrapper[7320]: I0312 12:22:02.073965 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-kubernetes\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.074496 master-0 kubenswrapper[7320]: I0312 12:22:02.073989 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-sys\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.074496 master-0 kubenswrapper[7320]: I0312 12:22:02.074019 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-host\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.074496 master-0 kubenswrapper[7320]: I0312 12:22:02.073998 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-run\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.074496 master-0 kubenswrapper[7320]: I0312 12:22:02.074206 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-lib-modules\") pod \"tuned-9zrvj\" (UID: 
\"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.074496 master-0 kubenswrapper[7320]: I0312 12:22:02.074285 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-modprobe-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.080496 master-0 kubenswrapper[7320]: I0312 12:22:02.078699 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-tmp\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.082829 master-0 kubenswrapper[7320]: I0312 12:22:02.082787 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-tuned\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.098351 master-0 kubenswrapper[7320]: I0312 12:22:02.098308 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhwbb\" (UniqueName: \"kubernetes.io/projected/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-kube-api-access-qhwbb\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:22:02.145146 master-0 kubenswrapper[7320]: I0312 12:22:02.145095 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"9cc32576-d801-439b-aed4-40214599f785","Type":"ContainerStarted","Data":"f8e1e52b4ed16453350c5b4c9e970a4da74486a269d74739928254c8b8e0a862"} Mar 12 12:22:02.145146 
master-0 kubenswrapper[7320]: I0312 12:22:02.145147 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"9cc32576-d801-439b-aed4-40214599f785","Type":"ContainerStarted","Data":"c6f0a91271347e4cd47b779d079a6c1b6c52cb4b3e212d0940cc5e02899bf57c"} Mar 12 12:22:02.147059 master-0 kubenswrapper[7320]: I0312 12:22:02.147019 7320 generic.go:334] "Generic (PLEG): container finished" podID="a87e74f5-3187-44b7-8125-2349647dac7c" containerID="a1be6b26c4d2e309a5a9f64c63127ab28c54a1db65658ddc3700ee8495f82e1e" exitCode=0 Mar 12 12:22:02.147225 master-0 kubenswrapper[7320]: I0312 12:22:02.147206 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" event={"ID":"a87e74f5-3187-44b7-8125-2349647dac7c","Type":"ContainerDied","Data":"a1be6b26c4d2e309a5a9f64c63127ab28c54a1db65658ddc3700ee8495f82e1e"} Mar 12 12:22:02.148574 master-0 kubenswrapper[7320]: I0312 12:22:02.148542 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" event={"ID":"b9194868-75ce-4138-a9d4-ddd64660c529","Type":"ContainerStarted","Data":"6ff152a8cc30b3ca7b40abe31bb7a67d4765a151cbd91ac76f3274a4601e4bc4"} Mar 12 12:22:02.150683 master-0 kubenswrapper[7320]: I0312 12:22:02.150648 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1","Type":"ContainerStarted","Data":"c6ebc8cdb1ea535bf07be2b08b9bb0f2c20d1bfada4f7f7c593c77044d94f79a"} Mar 12 12:22:02.150683 master-0 kubenswrapper[7320]: I0312 12:22:02.150684 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1","Type":"ContainerStarted","Data":"dd601e94b2e497fb8ecd7edce09ab1fd613fe37dc7cd5a8b1f710f61827b3468"} Mar 12 12:22:02.152096 master-0 kubenswrapper[7320]: I0312 
12:22:02.152061 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" event={"ID":"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7","Type":"ContainerStarted","Data":"d9158e2792bc5daa3d97edaa7adc8b6261f91ff0f769900d61941d5e7fc47a5b"}
Mar 12 12:22:02.152096 master-0 kubenswrapper[7320]: I0312 12:22:02.152095 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" event={"ID":"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7","Type":"ContainerStarted","Data":"30a3ad00ad8b539daab9026a90d4e3988e2f4f85ee4901aab157d1221abec13c"}
Mar 12 12:22:02.153103 master-0 kubenswrapper[7320]: I0312 12:22:02.153081 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-7849849f76-86f2r" event={"ID":"c873b656-d2aa-4d0e-aa22-9f8d35186473","Type":"ContainerStarted","Data":"b5ba5b90d6ae03cfb78b058e1659ce46a060105bfebabaafdc41fb8977ded8b7"}
Mar 12 12:22:02.153925 master-0 kubenswrapper[7320]: I0312 12:22:02.153904 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" event={"ID":"8e6f7496-1047-482d-9203-ff83a9eb7d93","Type":"ContainerStarted","Data":"cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676"}
Mar 12 12:22:02.156053 master-0 kubenswrapper[7320]: I0312 12:22:02.156018 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" event={"ID":"cfd178d7-f518-413b-95ab-ab6687be6e0f","Type":"ContainerStarted","Data":"c725e2adf84dc5ad7765bcbefc340bfdc4d1be5e28386cdc2ec851ee8c8574d8"}
Mar 12 12:22:02.157447 master-0 kubenswrapper[7320]: I0312 12:22:02.157412 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" event={"ID":"a22189f2-3f35-4ea6-9892-39a1b46637e2","Type":"ContainerStarted","Data":"da45d979944566f6767aede6064609f51b91e0d3720ab5e1102275898eb55317"}
Mar 12 12:22:02.157447 master-0 kubenswrapper[7320]: I0312 12:22:02.157441 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" event={"ID":"a22189f2-3f35-4ea6-9892-39a1b46637e2","Type":"ContainerStarted","Data":"97f79ecdfa3c97644b3ca23d2c5dae1dd9db4d81745183dd21308d5c06844fd7"}
Mar 12 12:22:02.162630 master-0 kubenswrapper[7320]: I0312 12:22:02.162542 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=4.162520446 podStartE2EDuration="4.162520446s" podCreationTimestamp="2026-03-12 12:21:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:02.161821616 +0000 UTC m=+44.720865497" watchObservedRunningTime="2026-03-12 12:22:02.162520446 +0000 UTC m=+44.721564327"
Mar 12 12:22:02.196298 master-0 kubenswrapper[7320]: I0312 12:22:02.196238 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9zrvj"
Mar 12 12:22:02.222539 master-0 kubenswrapper[7320]: I0312 12:22:02.216576 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=5.215791317 podStartE2EDuration="5.215791317s" podCreationTimestamp="2026-03-12 12:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:02.191287679 +0000 UTC m=+44.750331560" watchObservedRunningTime="2026-03-12 12:22:02.215791317 +0000 UTC m=+44.774835198"
Mar 12 12:22:02.377294 master-0 kubenswrapper[7320]: I0312 12:22:02.375654 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-k8t84"]
Mar 12 12:22:02.377294 master-0 kubenswrapper[7320]: I0312 12:22:02.376604 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.381022 master-0 kubenswrapper[7320]: I0312 12:22:02.380429 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 12 12:22:02.381022 master-0 kubenswrapper[7320]: I0312 12:22:02.380793 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 12 12:22:02.381022 master-0 kubenswrapper[7320]: I0312 12:22:02.380947 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 12 12:22:02.381022 master-0 kubenswrapper[7320]: I0312 12:22:02.380978 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 12 12:22:02.392150 master-0 kubenswrapper[7320]: I0312 12:22:02.388923 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k8t84"]
Mar 12 12:22:02.451468 master-0 kubenswrapper[7320]: I0312 12:22:02.445990 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz"
Mar 12 12:22:02.478048 master-0 kubenswrapper[7320]: I0312 12:22:02.477804 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njx9l\" (UniqueName: \"kubernetes.io/projected/9d47f860-d64a-49b8-b404-a67cbc2faeb6-kube-api-access-njx9l\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.478048 master-0 kubenswrapper[7320]: I0312 12:22:02.477866 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.478048 master-0 kubenswrapper[7320]: I0312 12:22:02.477905 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d47f860-d64a-49b8-b404-a67cbc2faeb6-metrics-tls\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.578527 master-0 kubenswrapper[7320]: I0312 12:22:02.578361 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-serving-cert\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.578801 master-0 kubenswrapper[7320]: I0312 12:22:02.578785 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-audit-policies\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.578944 master-0 kubenswrapper[7320]: I0312 12:22:02.578919 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-encryption-config\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.579088 master-0 kubenswrapper[7320]: I0312 12:22:02.579058 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-serving-ca\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.579190 master-0 kubenswrapper[7320]: I0312 12:22:02.579178 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-trusted-ca-bundle\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.579296 master-0 kubenswrapper[7320]: I0312 12:22:02.579280 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8tpp\" (UniqueName: \"kubernetes.io/projected/a87e74f5-3187-44b7-8125-2349647dac7c-kube-api-access-s8tpp\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.579418 master-0 kubenswrapper[7320]: I0312 12:22:02.579402 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a87e74f5-3187-44b7-8125-2349647dac7c-audit-dir\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.579564 master-0 kubenswrapper[7320]: I0312 12:22:02.579547 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-client\") pod \"a87e74f5-3187-44b7-8125-2349647dac7c\" (UID: \"a87e74f5-3187-44b7-8125-2349647dac7c\") "
Mar 12 12:22:02.579934 master-0 kubenswrapper[7320]: I0312 12:22:02.579916 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njx9l\" (UniqueName: \"kubernetes.io/projected/9d47f860-d64a-49b8-b404-a67cbc2faeb6-kube-api-access-njx9l\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.580057 master-0 kubenswrapper[7320]: I0312 12:22:02.580045 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.580164 master-0 kubenswrapper[7320]: I0312 12:22:02.580148 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d47f860-d64a-49b8-b404-a67cbc2faeb6-metrics-tls\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.582277 master-0 kubenswrapper[7320]: I0312 12:22:02.582243 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:22:02.582404 master-0 kubenswrapper[7320]: I0312 12:22:02.582193 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:22:02.583077 master-0 kubenswrapper[7320]: I0312 12:22:02.583049 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.583147 master-0 kubenswrapper[7320]: I0312 12:22:02.583109 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a87e74f5-3187-44b7-8125-2349647dac7c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:22:02.588554 master-0 kubenswrapper[7320]: I0312 12:22:02.583526 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:22:02.588554 master-0 kubenswrapper[7320]: I0312 12:22:02.586626 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a87e74f5-3187-44b7-8125-2349647dac7c-kube-api-access-s8tpp" (OuterVolumeSpecName: "kube-api-access-s8tpp") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "kube-api-access-s8tpp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:22:02.589704 master-0 kubenswrapper[7320]: I0312 12:22:02.589656 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:22:02.589832 master-0 kubenswrapper[7320]: I0312 12:22:02.589805 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d47f860-d64a-49b8-b404-a67cbc2faeb6-metrics-tls\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.590083 master-0 kubenswrapper[7320]: I0312 12:22:02.590044 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:22:02.595820 master-0 kubenswrapper[7320]: I0312 12:22:02.595781 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "a87e74f5-3187-44b7-8125-2349647dac7c" (UID: "a87e74f5-3187-44b7-8125-2349647dac7c"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:22:02.606556 master-0 kubenswrapper[7320]: I0312 12:22:02.605917 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njx9l\" (UniqueName: \"kubernetes.io/projected/9d47f860-d64a-49b8-b404-a67cbc2faeb6-kube-api-access-njx9l\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.681938 master-0 kubenswrapper[7320]: I0312 12:22:02.681880 7320 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.681938 master-0 kubenswrapper[7320]: I0312 12:22:02.681913 7320 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-audit-policies\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.681938 master-0 kubenswrapper[7320]: I0312 12:22:02.681923 7320 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-encryption-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.681938 master-0 kubenswrapper[7320]: I0312 12:22:02.681933 7320 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-serving-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.681938 master-0 kubenswrapper[7320]: I0312 12:22:02.681942 7320 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a87e74f5-3187-44b7-8125-2349647dac7c-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.681938 master-0 kubenswrapper[7320]: I0312 12:22:02.681952 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8tpp\" (UniqueName: \"kubernetes.io/projected/a87e74f5-3187-44b7-8125-2349647dac7c-kube-api-access-s8tpp\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.681938 master-0 kubenswrapper[7320]: I0312 12:22:02.681960 7320 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a87e74f5-3187-44b7-8125-2349647dac7c-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.682353 master-0 kubenswrapper[7320]: I0312 12:22:02.681970 7320 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a87e74f5-3187-44b7-8125-2349647dac7c-etcd-client\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:02.692607 master-0 kubenswrapper[7320]: I0312 12:22:02.692556 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-72w9q"]
Mar 12 12:22:02.692788 master-0 kubenswrapper[7320]: E0312 12:22:02.692737 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a87e74f5-3187-44b7-8125-2349647dac7c" containerName="fix-audit-permissions"
Mar 12 12:22:02.692788 master-0 kubenswrapper[7320]: I0312 12:22:02.692750 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="a87e74f5-3187-44b7-8125-2349647dac7c" containerName="fix-audit-permissions"
Mar 12 12:22:02.692882 master-0 kubenswrapper[7320]: I0312 12:22:02.692817 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="a87e74f5-3187-44b7-8125-2349647dac7c" containerName="fix-audit-permissions"
Mar 12 12:22:02.693121 master-0 kubenswrapper[7320]: I0312 12:22:02.693104 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:02.746081 master-0 kubenswrapper[7320]: I0312 12:22:02.746035 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:02.783117 master-0 kubenswrapper[7320]: I0312 12:22:02.783041 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36852fda-6aee-4a36-8724-537f1260c4c8-hosts-file\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:02.783289 master-0 kubenswrapper[7320]: I0312 12:22:02.783188 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54547\" (UniqueName: \"kubernetes.io/projected/36852fda-6aee-4a36-8724-537f1260c4c8-kube-api-access-54547\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:02.885422 master-0 kubenswrapper[7320]: I0312 12:22:02.884081 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54547\" (UniqueName: \"kubernetes.io/projected/36852fda-6aee-4a36-8724-537f1260c4c8-kube-api-access-54547\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:02.885422 master-0 kubenswrapper[7320]: I0312 12:22:02.884414 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36852fda-6aee-4a36-8724-537f1260c4c8-hosts-file\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:02.885422 master-0 kubenswrapper[7320]: I0312 12:22:02.884594 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36852fda-6aee-4a36-8724-537f1260c4c8-hosts-file\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:02.918980 master-0 kubenswrapper[7320]: I0312 12:22:02.918933 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54547\" (UniqueName: \"kubernetes.io/projected/36852fda-6aee-4a36-8724-537f1260c4c8-kube-api-access-54547\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:02.939493 master-0 kubenswrapper[7320]: I0312 12:22:02.939436 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-k8t84"]
Mar 12 12:22:02.957598 master-0 kubenswrapper[7320]: W0312 12:22:02.957562 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d47f860_d64a_49b8_b404_a67cbc2faeb6.slice/crio-88bba27aba99f4035ab7b0c7f4bc171c7586160f39546ce2e2e9d71e92be1a68 WatchSource:0}: Error finding container 88bba27aba99f4035ab7b0c7f4bc171c7586160f39546ce2e2e9d71e92be1a68: Status 404 returned error can't find the container with id 88bba27aba99f4035ab7b0c7f4bc171c7586160f39546ce2e2e9d71e92be1a68
Mar 12 12:22:03.010895 master-0 kubenswrapper[7320]: I0312 12:22:03.010855 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:22:03.029065 master-0 kubenswrapper[7320]: W0312 12:22:03.028995 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36852fda_6aee_4a36_8724_537f1260c4c8.slice/crio-1588984bdf65a943d367a0cf55fa9d9d29fbe3f46cc19ebf7ccc83dd17e2e7b7 WatchSource:0}: Error finding container 1588984bdf65a943d367a0cf55fa9d9d29fbe3f46cc19ebf7ccc83dd17e2e7b7: Status 404 returned error can't find the container with id 1588984bdf65a943d367a0cf55fa9d9d29fbe3f46cc19ebf7ccc83dd17e2e7b7
Mar 12 12:22:03.167206 master-0 kubenswrapper[7320]: I0312 12:22:03.167073 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz" event={"ID":"a87e74f5-3187-44b7-8125-2349647dac7c","Type":"ContainerDied","Data":"47b3552265d71c98eea3d38ead40654ad4166a125299885e5f9f87f8e67a66bc"}
Mar 12 12:22:03.167206 master-0 kubenswrapper[7320]: I0312 12:22:03.167130 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-b86747447-pj5gz"
Mar 12 12:22:03.167206 master-0 kubenswrapper[7320]: I0312 12:22:03.167151 7320 scope.go:117] "RemoveContainer" containerID="a1be6b26c4d2e309a5a9f64c63127ab28c54a1db65658ddc3700ee8495f82e1e"
Mar 12 12:22:03.169322 master-0 kubenswrapper[7320]: I0312 12:22:03.168810 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-72w9q" event={"ID":"36852fda-6aee-4a36-8724-537f1260c4c8","Type":"ContainerStarted","Data":"1588984bdf65a943d367a0cf55fa9d9d29fbe3f46cc19ebf7ccc83dd17e2e7b7"}
Mar 12 12:22:03.169446 master-0 kubenswrapper[7320]: I0312 12:22:03.169425 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8t84" event={"ID":"9d47f860-d64a-49b8-b404-a67cbc2faeb6","Type":"ContainerStarted","Data":"88bba27aba99f4035ab7b0c7f4bc171c7586160f39546ce2e2e9d71e92be1a68"}
Mar 12 12:22:03.173896 master-0 kubenswrapper[7320]: I0312 12:22:03.173859 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" event={"ID":"ed5a074c-e194-4b16-a4c9-0d82830bf7ca","Type":"ContainerStarted","Data":"19c31531ee5649cb576405b14aacddf21e36571f8eaf592b6167c6c63c9a860b"}
Mar 12 12:22:03.173975 master-0 kubenswrapper[7320]: I0312 12:22:03.173904 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" event={"ID":"ed5a074c-e194-4b16-a4c9-0d82830bf7ca","Type":"ContainerStarted","Data":"317c3e727b1d477f7162b7d7b07ec5fb50aff41f9275c5fb657ce7cf379fb594"}
Mar 12 12:22:03.194390 master-0 kubenswrapper[7320]: I0312 12:22:03.194326 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" podStartSLOduration=2.194291001 podStartE2EDuration="2.194291001s" podCreationTimestamp="2026-03-12 12:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:03.192732915 +0000 UTC m=+45.751776816" watchObservedRunningTime="2026-03-12 12:22:03.194291001 +0000 UTC m=+45.753334882"
Mar 12 12:22:03.257680 master-0 kubenswrapper[7320]: I0312 12:22:03.257630 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-oauth-apiserver/apiserver-b86747447-pj5gz"]
Mar 12 12:22:03.258167 master-0 kubenswrapper[7320]: I0312 12:22:03.258121 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-oauth-apiserver/apiserver-b86747447-pj5gz"]
Mar 12 12:22:03.760747 master-0 kubenswrapper[7320]: I0312 12:22:03.760374 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a87e74f5-3187-44b7-8125-2349647dac7c" path="/var/lib/kubelet/pods/a87e74f5-3187-44b7-8125-2349647dac7c/volumes"
Mar 12 12:22:03.798857 master-0 kubenswrapper[7320]: I0312 12:22:03.798231 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8"
Mar 12 12:22:03.799610 master-0 kubenswrapper[7320]: E0312 12:22:03.799156 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:22:03.799610 master-0 kubenswrapper[7320]: E0312 12:22:03.799236 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca podName:624d2bff-bb9a-45ea-ac5b-00bb55d758d7 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:07.799215128 +0000 UTC m=+50.358259009 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca") pod "controller-manager-5fc9f9dcb7-8f5f8" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7") : configmap "client-ca" not found
Mar 12 12:22:04.181401 master-0 kubenswrapper[7320]: I0312 12:22:04.181225 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-72w9q" event={"ID":"36852fda-6aee-4a36-8724-537f1260c4c8","Type":"ContainerStarted","Data":"48e3e28e7f43029892f5e4f692d8a5b13e60298d5a5df7adb1bf2d5452a08031"}
Mar 12 12:22:04.247932 master-0 kubenswrapper[7320]: I0312 12:22:04.247867 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"]
Mar 12 12:22:04.249073 master-0 kubenswrapper[7320]: I0312 12:22:04.249025 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.252665 master-0 kubenswrapper[7320]: I0312 12:22:04.252609 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 12 12:22:04.253764 master-0 kubenswrapper[7320]: I0312 12:22:04.253714 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 12 12:22:04.254351 master-0 kubenswrapper[7320]: I0312 12:22:04.254291 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 12 12:22:04.254351 master-0 kubenswrapper[7320]: I0312 12:22:04.254320 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 12 12:22:04.256562 master-0 kubenswrapper[7320]: I0312 12:22:04.256527 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 12 12:22:04.256951 master-0 kubenswrapper[7320]: I0312 12:22:04.256571 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 12 12:22:04.257465 master-0 kubenswrapper[7320]: I0312 12:22:04.257420 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 12 12:22:04.261383 master-0 kubenswrapper[7320]: I0312 12:22:04.261333 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 12 12:22:04.267467 master-0 kubenswrapper[7320]: I0312 12:22:04.267394 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-72w9q" podStartSLOduration=2.267374837 podStartE2EDuration="2.267374837s" podCreationTimestamp="2026-03-12 12:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:04.266529803 +0000 UTC m=+46.825573684" watchObservedRunningTime="2026-03-12 12:22:04.267374837 +0000 UTC m=+46.826418718"
Mar 12 12:22:04.280662 master-0 kubenswrapper[7320]: I0312 12:22:04.280605 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"]
Mar 12 12:22:04.305783 master-0 kubenswrapper[7320]: I0312 12:22:04.305728 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/720101f1-0833-45af-a5b7-4910ece2a589-audit-dir\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.305783 master-0 kubenswrapper[7320]: I0312 12:22:04.305775 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-etcd-client\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.306077 master-0 kubenswrapper[7320]: I0312 12:22:04.305802 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-serving-cert\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.306077 master-0 kubenswrapper[7320]: I0312 12:22:04.305853 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcb5s\" (UniqueName: \"kubernetes.io/projected/720101f1-0833-45af-a5b7-4910ece2a589-kube-api-access-dcb5s\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.306077 master-0 kubenswrapper[7320]: I0312 12:22:04.305872 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-etcd-serving-ca\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.306077 master-0 kubenswrapper[7320]: I0312 12:22:04.305896 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-trusted-ca-bundle\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.306077 master-0 kubenswrapper[7320]: I0312 12:22:04.305941 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-encryption-config\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.306077 master-0 kubenswrapper[7320]: I0312 12:22:04.305959 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-audit-policies\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.406944 master-0 kubenswrapper[7320]: I0312 12:22:04.406802 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcb5s\" (UniqueName: \"kubernetes.io/projected/720101f1-0833-45af-a5b7-4910ece2a589-kube-api-access-dcb5s\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.406944 master-0 kubenswrapper[7320]: I0312 12:22:04.406857 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-etcd-serving-ca\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.407562 master-0 kubenswrapper[7320]: I0312 12:22:04.407144 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-trusted-ca-bundle\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.407562 master-0 kubenswrapper[7320]: I0312 12:22:04.407453 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-encryption-config\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.407916 master-0 kubenswrapper[7320]: I0312 12:22:04.407873 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-etcd-serving-ca\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.408062 master-0 kubenswrapper[7320]: I0312 12:22:04.408008 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-trusted-ca-bundle\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.408619 master-0 kubenswrapper[7320]: I0312 12:22:04.408581 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-audit-policies\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.408695 master-0 kubenswrapper[7320]: I0312 12:22:04.408642 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/720101f1-0833-45af-a5b7-4910ece2a589-audit-dir\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.408695 master-0 kubenswrapper[7320]: I0312 12:22:04.408666 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-etcd-client\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.408846 master-0 kubenswrapper[7320]: I0312 12:22:04.408772 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/720101f1-0833-45af-a5b7-4910ece2a589-audit-dir\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.408916 master-0 kubenswrapper[7320]: I0312 12:22:04.408898 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-serving-cert\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.412371 master-0 kubenswrapper[7320]: I0312 12:22:04.410107 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-audit-policies\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.413619 master-0 kubenswrapper[7320]: I0312 12:22:04.413579 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-serving-cert\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.413679 master-0 kubenswrapper[7320]: I0312 12:22:04.413647 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-encryption-config\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.416614 master-0 kubenswrapper[7320]: I0312 12:22:04.416578 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-etcd-client\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.432294 master-0 kubenswrapper[7320]: I0312 12:22:04.432180 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcb5s\" (UniqueName: \"kubernetes.io/projected/720101f1-0833-45af-a5b7-4910ece2a589-kube-api-access-dcb5s\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:04.586847 master-0 kubenswrapper[7320]: I0312 12:22:04.586741 7320 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:22:04.971920 master-0 kubenswrapper[7320]: I0312 12:22:04.971813 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"] Mar 12 12:22:04.979084 master-0 kubenswrapper[7320]: W0312 12:22:04.979027 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod720101f1_0833_45af_a5b7_4910ece2a589.slice/crio-b320b1f710b0d644ecef2c2ad1bae1650b0f603989d0eba39702b2f48e918747 WatchSource:0}: Error finding container b320b1f710b0d644ecef2c2ad1bae1650b0f603989d0eba39702b2f48e918747: Status 404 returned error can't find the container with id b320b1f710b0d644ecef2c2ad1bae1650b0f603989d0eba39702b2f48e918747 Mar 12 12:22:05.015927 master-0 kubenswrapper[7320]: I0312 12:22:05.015860 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr" Mar 12 12:22:05.018393 master-0 kubenswrapper[7320]: E0312 12:22:05.016045 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 12 12:22:05.018393 master-0 kubenswrapper[7320]: E0312 12:22:05.016134 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca podName:d45da9d2-861a-497c-b237-290c70e65ef3 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:13.016083878 +0000 UTC m=+55.575127759 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca") pod "route-controller-manager-9bbcffc94-dpvxr" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3") : configmap "client-ca" not found Mar 12 12:22:05.211763 master-0 kubenswrapper[7320]: I0312 12:22:05.207863 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" event={"ID":"720101f1-0833-45af-a5b7-4910ece2a589","Type":"ContainerStarted","Data":"b320b1f710b0d644ecef2c2ad1bae1650b0f603989d0eba39702b2f48e918747"} Mar 12 12:22:05.954468 master-0 kubenswrapper[7320]: I0312 12:22:05.954391 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 12 12:22:05.954695 master-0 kubenswrapper[7320]: I0312 12:22:05.954647 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="9cc32576-d801-439b-aed4-40214599f785" containerName="installer" containerID="cri-o://f8e1e52b4ed16453350c5b4c9e970a4da74486a269d74739928254c8b8e0a862" gracePeriod=30 Mar 12 12:22:06.212141 master-0 kubenswrapper[7320]: I0312 12:22:06.212026 7320 generic.go:334] "Generic (PLEG): container finished" podID="720101f1-0833-45af-a5b7-4910ece2a589" containerID="f1911a04d932c171347c3c6692dcdee354bd0734087d8540b5936ed3d825696a" exitCode=0 Mar 12 12:22:06.212141 master-0 kubenswrapper[7320]: I0312 12:22:06.212081 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" event={"ID":"720101f1-0833-45af-a5b7-4910ece2a589","Type":"ContainerDied","Data":"f1911a04d932c171347c3c6692dcdee354bd0734087d8540b5936ed3d825696a"} Mar 12 12:22:06.643699 master-0 kubenswrapper[7320]: I0312 12:22:06.634622 7320 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_9cc32576-d801-439b-aed4-40214599f785/installer/0.log" Mar 12 12:22:06.643699 master-0 kubenswrapper[7320]: I0312 12:22:06.634671 7320 generic.go:334] "Generic (PLEG): container finished" podID="9cc32576-d801-439b-aed4-40214599f785" containerID="f8e1e52b4ed16453350c5b4c9e970a4da74486a269d74739928254c8b8e0a862" exitCode=1 Mar 12 12:22:06.643699 master-0 kubenswrapper[7320]: I0312 12:22:06.634704 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"9cc32576-d801-439b-aed4-40214599f785","Type":"ContainerDied","Data":"f8e1e52b4ed16453350c5b4c9e970a4da74486a269d74739928254c8b8e0a862"} Mar 12 12:22:07.640754 master-0 kubenswrapper[7320]: I0312 12:22:07.640714 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8t84" event={"ID":"9d47f860-d64a-49b8-b404-a67cbc2faeb6","Type":"ContainerStarted","Data":"4d1f13ead8346ec5a069dd75d72093a65fa9217f0c4fdf81d4181ed7fb91bbcb"} Mar 12 12:22:07.642387 master-0 kubenswrapper[7320]: I0312 12:22:07.642363 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" event={"ID":"720101f1-0833-45af-a5b7-4910ece2a589","Type":"ContainerStarted","Data":"2a763674d7fee4b0bbe9e8926242eb3c7f1de3ff34487241dfdb9ec78e30a5d4"} Mar 12 12:22:07.804096 master-0 kubenswrapper[7320]: I0312 12:22:07.804045 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:07.804396 master-0 kubenswrapper[7320]: E0312 12:22:07.804356 7320 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found 
Mar 12 12:22:07.804459 master-0 kubenswrapper[7320]: E0312 12:22:07.804411 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca podName:624d2bff-bb9a-45ea-ac5b-00bb55d758d7 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:15.804398088 +0000 UTC m=+58.363441969 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca") pod "controller-manager-5fc9f9dcb7-8f5f8" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7") : configmap "client-ca" not found
Mar 12 12:22:07.991051 master-0 kubenswrapper[7320]: I0312 12:22:07.991003 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 12 12:22:07.991936 master-0 kubenswrapper[7320]: I0312 12:22:07.991915 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.004567 master-0 kubenswrapper[7320]: I0312 12:22:08.003161 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 12 12:22:08.014877 master-0 kubenswrapper[7320]: I0312 12:22:08.014821 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.015040 master-0 kubenswrapper[7320]: I0312 12:22:08.014920 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a59a6bb7-f966-4208-ba85-452095404891-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.015040 master-0 kubenswrapper[7320]: I0312 12:22:08.015027 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-var-lock\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.043271 master-0 kubenswrapper[7320]: I0312 12:22:08.043197 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" podStartSLOduration=12.043173945 podStartE2EDuration="12.043173945s" podCreationTimestamp="2026-03-12 12:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:08.014530306 +0000 UTC m=+50.573574197" watchObservedRunningTime="2026-03-12 12:22:08.043173945 +0000 UTC m=+50.602217826"
Mar 12 12:22:08.053764 master-0 kubenswrapper[7320]: I0312 12:22:08.053715 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_9cc32576-d801-439b-aed4-40214599f785/installer/0.log"
Mar 12 12:22:08.053908 master-0 kubenswrapper[7320]: I0312 12:22:08.053812 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 12 12:22:08.120518 master-0 kubenswrapper[7320]: I0312 12:22:08.118024 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-var-lock\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.120518 master-0 kubenswrapper[7320]: I0312 12:22:08.118086 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.120518 master-0 kubenswrapper[7320]: I0312 12:22:08.118126 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a59a6bb7-f966-4208-ba85-452095404891-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.120518 master-0 kubenswrapper[7320]: I0312 12:22:08.118518 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-var-lock\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.120518 master-0 kubenswrapper[7320]: I0312 12:22:08.118551 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.160590 master-0 kubenswrapper[7320]: I0312 12:22:08.159573 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a59a6bb7-f966-4208-ba85-452095404891-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.219246 master-0 kubenswrapper[7320]: I0312 12:22:08.219197 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-var-lock\") pod \"9cc32576-d801-439b-aed4-40214599f785\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") "
Mar 12 12:22:08.219246 master-0 kubenswrapper[7320]: I0312 12:22:08.219255 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc32576-d801-439b-aed4-40214599f785-kube-api-access\") pod \"9cc32576-d801-439b-aed4-40214599f785\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") "
Mar 12 12:22:08.219469 master-0 kubenswrapper[7320]: I0312 12:22:08.219318 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-kubelet-dir\") pod \"9cc32576-d801-439b-aed4-40214599f785\" (UID: \"9cc32576-d801-439b-aed4-40214599f785\") "
Mar 12 12:22:08.219469 master-0 kubenswrapper[7320]: I0312 12:22:08.219348 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-var-lock" (OuterVolumeSpecName: "var-lock") pod "9cc32576-d801-439b-aed4-40214599f785" (UID: "9cc32576-d801-439b-aed4-40214599f785"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:22:08.219600 master-0 kubenswrapper[7320]: I0312 12:22:08.219578 7320 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:08.219685 master-0 kubenswrapper[7320]: I0312 12:22:08.219636 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9cc32576-d801-439b-aed4-40214599f785" (UID: "9cc32576-d801-439b-aed4-40214599f785"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:22:08.223509 master-0 kubenswrapper[7320]: I0312 12:22:08.223060 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc32576-d801-439b-aed4-40214599f785-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9cc32576-d801-439b-aed4-40214599f785" (UID: "9cc32576-d801-439b-aed4-40214599f785"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:22:08.322423 master-0 kubenswrapper[7320]: I0312 12:22:08.320284 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9cc32576-d801-439b-aed4-40214599f785-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:08.322423 master-0 kubenswrapper[7320]: I0312 12:22:08.320321 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9cc32576-d801-439b-aed4-40214599f785-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:08.393967 master-0 kubenswrapper[7320]: I0312 12:22:08.393913 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:08.649115 master-0 kubenswrapper[7320]: I0312 12:22:08.648959 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-k8t84" event={"ID":"9d47f860-d64a-49b8-b404-a67cbc2faeb6","Type":"ContainerStarted","Data":"576622ff798efc23c86d7234f313c65970e0566ff06d2b368943f0f858126f85"}
Mar 12 12:22:08.649554 master-0 kubenswrapper[7320]: I0312 12:22:08.649128 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:08.651120 master-0 kubenswrapper[7320]: I0312 12:22:08.650717 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_9cc32576-d801-439b-aed4-40214599f785/installer/0.log"
Mar 12 12:22:08.651120 master-0 kubenswrapper[7320]: I0312 12:22:08.650804 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"9cc32576-d801-439b-aed4-40214599f785","Type":"ContainerDied","Data":"c6f0a91271347e4cd47b779d079a6c1b6c52cb4b3e212d0940cc5e02899bf57c"}
Mar 12 12:22:08.651120 master-0 kubenswrapper[7320]: I0312 12:22:08.650861 7320 scope.go:117] "RemoveContainer" containerID="f8e1e52b4ed16453350c5b4c9e970a4da74486a269d74739928254c8b8e0a862"
Mar 12 12:22:08.651120 master-0 kubenswrapper[7320]: I0312 12:22:08.650805 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 12 12:22:08.731839 master-0 kubenswrapper[7320]: I0312 12:22:08.730578 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-k8t84" podStartSLOduration=3.57236305 podStartE2EDuration="6.730561939s" podCreationTimestamp="2026-03-12 12:22:02 +0000 UTC" firstStartedPulling="2026-03-12 12:22:02.960511531 +0000 UTC m=+45.519555412" lastFinishedPulling="2026-03-12 12:22:06.11871042 +0000 UTC m=+48.677754301" observedRunningTime="2026-03-12 12:22:08.728805507 +0000 UTC m=+51.287849408" watchObservedRunningTime="2026-03-12 12:22:08.730561939 +0000 UTC m=+51.289605820"
Mar 12 12:22:08.745833 master-0 kubenswrapper[7320]: I0312 12:22:08.745760 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 12 12:22:08.751509 master-0 kubenswrapper[7320]: I0312 12:22:08.750441 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 12 12:22:08.762882 master-0 kubenswrapper[7320]: I0312 12:22:08.761498 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 12 12:22:08.791523 master-0 kubenswrapper[7320]: W0312 12:22:08.791439 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda59a6bb7_f966_4208_ba85_452095404891.slice/crio-d67f03c484ab80e82af34bfbee706a0210f43b50a2db09932840ccaa2c6ed0f8 WatchSource:0}: Error finding container d67f03c484ab80e82af34bfbee706a0210f43b50a2db09932840ccaa2c6ed0f8: Status 404 returned error can't find the container with id d67f03c484ab80e82af34bfbee706a0210f43b50a2db09932840ccaa2c6ed0f8
Mar 12 12:22:09.588506 master-0 kubenswrapper[7320]: I0312 12:22:09.587749 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:09.588506 master-0 kubenswrapper[7320]: I0312 12:22:09.587837 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:09.594304 master-0 kubenswrapper[7320]: I0312 12:22:09.594272 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:09.657406 master-0 kubenswrapper[7320]: I0312 12:22:09.657180 7320 generic.go:334] "Generic (PLEG): container finished" podID="c873b656-d2aa-4d0e-aa22-9f8d35186473" containerID="cff5fae1b95c8c229432e602087c75f134e2f63191ff0620438ff838fbf87945" exitCode=0
Mar 12 12:22:09.657406 master-0 kubenswrapper[7320]: I0312 12:22:09.657227 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-7849849f76-86f2r" event={"ID":"c873b656-d2aa-4d0e-aa22-9f8d35186473","Type":"ContainerDied","Data":"cff5fae1b95c8c229432e602087c75f134e2f63191ff0620438ff838fbf87945"}
Mar 12 12:22:09.659157 master-0 kubenswrapper[7320]: I0312 12:22:09.658995 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"a59a6bb7-f966-4208-ba85-452095404891","Type":"ContainerStarted","Data":"b0b0a71bb15ee38a2037cc0d67a425037c9a862e431396ce17c0501ae76f6aae"}
Mar 12 12:22:09.659157 master-0 kubenswrapper[7320]: I0312 12:22:09.659058 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"a59a6bb7-f966-4208-ba85-452095404891","Type":"ContainerStarted","Data":"d67f03c484ab80e82af34bfbee706a0210f43b50a2db09932840ccaa2c6ed0f8"}
Mar 12 12:22:09.665653 master-0 kubenswrapper[7320]: I0312 12:22:09.665621 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:22:09.693722 master-0 kubenswrapper[7320]: I0312 12:22:09.693633 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=2.69361623 podStartE2EDuration="2.69361623s" podCreationTimestamp="2026-03-12 12:22:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:09.691690564 +0000 UTC m=+52.250734445" watchObservedRunningTime="2026-03-12 12:22:09.69361623 +0000 UTC m=+52.252660111"
Mar 12 12:22:09.768873 master-0 kubenswrapper[7320]: I0312 12:22:09.767189 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc32576-d801-439b-aed4-40214599f785" path="/var/lib/kubelet/pods/9cc32576-d801-439b-aed4-40214599f785/volumes"
Mar 12 12:22:10.668416 master-0 kubenswrapper[7320]: I0312 12:22:10.668345 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-7849849f76-86f2r" event={"ID":"c873b656-d2aa-4d0e-aa22-9f8d35186473","Type":"ContainerStarted","Data":"4689e35aed6bb114df244b70bd2d2c8403bfa3fb7600219c0438e0c1f79df4a2"}
Mar 12 12:22:10.668416 master-0 kubenswrapper[7320]: I0312 12:22:10.668406 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-7849849f76-86f2r" event={"ID":"c873b656-d2aa-4d0e-aa22-9f8d35186473","Type":"ContainerStarted","Data":"f5b6fee106f3e83c51837060458c5c521b6afed033597b44c305dff52fd06c51"}
Mar 12 12:22:10.696624 master-0 kubenswrapper[7320]: I0312 12:22:10.695685 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-7849849f76-86f2r" podStartSLOduration=8.436771931 podStartE2EDuration="14.695667034s" podCreationTimestamp="2026-03-12 12:21:56 +0000 UTC" firstStartedPulling="2026-03-12 12:22:01.822815451 +0000 UTC m=+44.381859332" lastFinishedPulling="2026-03-12 12:22:08.081710554 +0000 UTC m=+50.640754435" observedRunningTime="2026-03-12 12:22:10.694911452 +0000 UTC m=+53.253955333" watchObservedRunningTime="2026-03-12 12:22:10.695667034 +0000 UTC m=+53.254710915"
Mar 12 12:22:12.647257 master-0 kubenswrapper[7320]: I0312 12:22:12.646330 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 12 12:22:12.647257 master-0 kubenswrapper[7320]: E0312 12:22:12.646614 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc32576-d801-439b-aed4-40214599f785" containerName="installer"
Mar 12 12:22:12.647257 master-0 kubenswrapper[7320]: I0312 12:22:12.646664 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc32576-d801-439b-aed4-40214599f785" containerName="installer"
Mar 12 12:22:12.647257 master-0 kubenswrapper[7320]: I0312 12:22:12.646749 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc32576-d801-439b-aed4-40214599f785" containerName="installer"
Mar 12 12:22:12.647257 master-0 kubenswrapper[7320]: I0312 12:22:12.647201 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.650839 master-0 kubenswrapper[7320]: I0312 12:22:12.650710 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 12 12:22:12.668274 master-0 kubenswrapper[7320]: I0312 12:22:12.668186 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 12 12:22:12.720522 master-0 kubenswrapper[7320]: I0312 12:22:12.720378 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a381a5c1-050f-496f-b13f-eefa7310557f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.720522 master-0 kubenswrapper[7320]: I0312 12:22:12.720458 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-var-lock\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.720522 master-0 kubenswrapper[7320]: I0312 12:22:12.720513 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.821924 master-0 kubenswrapper[7320]: I0312 12:22:12.821855 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-var-lock\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.822169 master-0 kubenswrapper[7320]: I0312 12:22:12.821965 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-var-lock\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.822169 master-0 kubenswrapper[7320]: I0312 12:22:12.822082 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.822264 master-0 kubenswrapper[7320]: I0312 12:22:12.822208 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.822303 master-0 kubenswrapper[7320]: I0312 12:22:12.822288 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a381a5c1-050f-496f-b13f-eefa7310557f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.840528 master-0 kubenswrapper[7320]: I0312 12:22:12.837479 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a381a5c1-050f-496f-b13f-eefa7310557f-kube-api-access\") pod \"installer-1-master-0\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:12.976743 master-0 kubenswrapper[7320]: I0312 12:22:12.976697 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 12 12:22:13.023938 master-0 kubenswrapper[7320]: I0312 12:22:13.023843 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") pod \"route-controller-manager-9bbcffc94-dpvxr\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"
Mar 12 12:22:13.024121 master-0 kubenswrapper[7320]: E0312 12:22:13.024022 7320 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 12 12:22:13.024121 master-0 kubenswrapper[7320]: E0312 12:22:13.024102 7320 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca podName:d45da9d2-861a-497c-b237-290c70e65ef3 nodeName:}" failed. No retries permitted until 2026-03-12 12:22:29.024078778 +0000 UTC m=+71.583122679 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca") pod "route-controller-manager-9bbcffc94-dpvxr" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3") : configmap "client-ca" not found
Mar 12 12:22:13.371163 master-0 kubenswrapper[7320]: I0312 12:22:13.371008 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 12 12:22:13.685813 master-0 kubenswrapper[7320]: I0312 12:22:13.685714 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a381a5c1-050f-496f-b13f-eefa7310557f","Type":"ContainerStarted","Data":"e52f1e71eae22523cdfa825351c7af19e24ac0cd6dee50cd3b0de236bbadccfe"}
Mar 12 12:22:14.690299 master-0 kubenswrapper[7320]: I0312 12:22:14.690229 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a381a5c1-050f-496f-b13f-eefa7310557f","Type":"ContainerStarted","Data":"57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4"}
Mar 12 12:22:14.701593 master-0 kubenswrapper[7320]: I0312 12:22:14.700967 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=2.7009525070000002 podStartE2EDuration="2.700952507s" podCreationTimestamp="2026-03-12 12:22:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:14.700640808 +0000 UTC m=+57.259684689" watchObservedRunningTime="2026-03-12 12:22:14.700952507 +0000 UTC m=+57.259996388"
Mar 12 12:22:15.382141 master-0 kubenswrapper[7320]: I0312 12:22:15.382007 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8"]
Mar 12 12:22:15.382493 master-0 kubenswrapper[7320]: E0312 12:22:15.382440 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" podUID="624d2bff-bb9a-45ea-ac5b-00bb55d758d7"
Mar 12 12:22:15.434371 master-0 kubenswrapper[7320]: I0312 12:22:15.434276 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:22:15.436776 master-0 kubenswrapper[7320]: I0312 12:22:15.435255 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:22:15.447471 master-0 kubenswrapper[7320]: I0312 12:22:15.447420 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"]
Mar 12 12:22:15.447832 master-0 kubenswrapper[7320]: E0312 12:22:15.447789 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr" podUID="d45da9d2-861a-497c-b237-290c70e65ef3"
Mar 12 12:22:15.448441 master-0 kubenswrapper[7320]: I0312 12:22:15.448391 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:22:15.696448 master-0 kubenswrapper[7320]: I0312 12:22:15.696249 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_e7c1a86e-0ad7-4978-80ae-163dbc44fafb/installer/0.log"
Mar 12 12:22:15.696448 master-0 kubenswrapper[7320]: I0312 12:22:15.696314 7320 generic.go:334] "Generic (PLEG): container finished" podID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb"
containerID="b315bdbdd00e7e78faaebae53e6e5aca4dcfbe013781ad0113a093ac0097dc1b" exitCode=1 Mar 12 12:22:15.696448 master-0 kubenswrapper[7320]: I0312 12:22:15.696393 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr" Mar 12 12:22:15.696972 master-0 kubenswrapper[7320]: I0312 12:22:15.696826 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e7c1a86e-0ad7-4978-80ae-163dbc44fafb","Type":"ContainerDied","Data":"b315bdbdd00e7e78faaebae53e6e5aca4dcfbe013781ad0113a093ac0097dc1b"} Mar 12 12:22:15.696972 master-0 kubenswrapper[7320]: I0312 12:22:15.696861 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e7c1a86e-0ad7-4978-80ae-163dbc44fafb","Type":"ContainerDied","Data":"cf3d062904eac9ebbd852d158da7968f568f0b7b439f11aee315609af8d30a5c"} Mar 12 12:22:15.696972 master-0 kubenswrapper[7320]: I0312 12:22:15.696876 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf3d062904eac9ebbd852d158da7968f568f0b7b439f11aee315609af8d30a5c" Mar 12 12:22:15.697785 master-0 kubenswrapper[7320]: I0312 12:22:15.697309 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:15.702005 master-0 kubenswrapper[7320]: I0312 12:22:15.701964 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:22:15.715222 master-0 kubenswrapper[7320]: I0312 12:22:15.714861 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_e7c1a86e-0ad7-4978-80ae-163dbc44fafb/installer/0.log" Mar 12 12:22:15.715222 master-0 kubenswrapper[7320]: I0312 12:22:15.714934 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 12 12:22:15.726806 master-0 kubenswrapper[7320]: I0312 12:22:15.726724 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr" Mar 12 12:22:15.729285 master-0 kubenswrapper[7320]: I0312 12:22:15.729253 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:15.865609 master-0 kubenswrapper[7320]: I0312 12:22:15.865366 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vbdcb\" (UniqueName: \"kubernetes.io/projected/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-kube-api-access-vbdcb\") pod \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " Mar 12 12:22:15.865609 master-0 kubenswrapper[7320]: I0312 12:22:15.865430 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kubelet-dir\") pod \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " Mar 12 12:22:15.865609 master-0 kubenswrapper[7320]: I0312 12:22:15.865491 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-config\") pod \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " Mar 12 12:22:15.865609 master-0 kubenswrapper[7320]: I0312 12:22:15.865573 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e7c1a86e-0ad7-4978-80ae-163dbc44fafb" (UID: "e7c1a86e-0ad7-4978-80ae-163dbc44fafb"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:22:15.865917 master-0 kubenswrapper[7320]: I0312 12:22:15.865636 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kube-api-access\") pod \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " Mar 12 12:22:15.865917 master-0 kubenswrapper[7320]: I0312 12:22:15.865701 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7qnt\" (UniqueName: \"kubernetes.io/projected/d45da9d2-861a-497c-b237-290c70e65ef3-kube-api-access-g7qnt\") pod \"d45da9d2-861a-497c-b237-290c70e65ef3\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " Mar 12 12:22:15.866417 master-0 kubenswrapper[7320]: I0312 12:22:15.866077 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-config" (OuterVolumeSpecName: "config") pod "624d2bff-bb9a-45ea-ac5b-00bb55d758d7" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:22:15.866417 master-0 kubenswrapper[7320]: I0312 12:22:15.866279 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-var-lock\") pod \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\" (UID: \"e7c1a86e-0ad7-4978-80ae-163dbc44fafb\") " Mar 12 12:22:15.866417 master-0 kubenswrapper[7320]: I0312 12:22:15.866309 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-var-lock" (OuterVolumeSpecName: "var-lock") pod "e7c1a86e-0ad7-4978-80ae-163dbc44fafb" (UID: "e7c1a86e-0ad7-4978-80ae-163dbc44fafb"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:22:15.866417 master-0 kubenswrapper[7320]: I0312 12:22:15.866316 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45da9d2-861a-497c-b237-290c70e65ef3-serving-cert\") pod \"d45da9d2-861a-497c-b237-290c70e65ef3\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " Mar 12 12:22:15.866417 master-0 kubenswrapper[7320]: I0312 12:22:15.866346 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-proxy-ca-bundles\") pod \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " Mar 12 12:22:15.866417 master-0 kubenswrapper[7320]: I0312 12:22:15.866372 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-serving-cert\") pod \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " Mar 12 12:22:15.866717 master-0 kubenswrapper[7320]: I0312 12:22:15.866436 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-config\") pod \"d45da9d2-861a-497c-b237-290c70e65ef3\" (UID: \"d45da9d2-861a-497c-b237-290c70e65ef3\") " Mar 12 12:22:15.866717 master-0 kubenswrapper[7320]: I0312 12:22:15.866580 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:15.866804 master-0 kubenswrapper[7320]: I0312 12:22:15.866794 7320 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "624d2bff-bb9a-45ea-ac5b-00bb55d758d7" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:22:15.867020 master-0 kubenswrapper[7320]: I0312 12:22:15.866979 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-config" (OuterVolumeSpecName: "config") pod "d45da9d2-861a-497c-b237-290c70e65ef3" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:22:15.867232 master-0 kubenswrapper[7320]: I0312 12:22:15.867206 7320 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.867309 master-0 kubenswrapper[7320]: I0312 12:22:15.867235 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.867309 master-0 kubenswrapper[7320]: I0312 12:22:15.867251 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.867309 master-0 kubenswrapper[7320]: I0312 12:22:15.867264 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.867309 master-0 kubenswrapper[7320]: I0312 12:22:15.867279 7320 
reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.867465 master-0 kubenswrapper[7320]: I0312 12:22:15.867337 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"controller-manager-5fc9f9dcb7-8f5f8\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:15.869645 master-0 kubenswrapper[7320]: I0312 12:22:15.869578 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7c1a86e-0ad7-4978-80ae-163dbc44fafb" (UID: "e7c1a86e-0ad7-4978-80ae-163dbc44fafb"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:22:15.870133 master-0 kubenswrapper[7320]: I0312 12:22:15.870096 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-kube-api-access-vbdcb" (OuterVolumeSpecName: "kube-api-access-vbdcb") pod "624d2bff-bb9a-45ea-ac5b-00bb55d758d7" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7"). InnerVolumeSpecName "kube-api-access-vbdcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:22:15.870357 master-0 kubenswrapper[7320]: I0312 12:22:15.870322 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "624d2bff-bb9a-45ea-ac5b-00bb55d758d7" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:22:15.870656 master-0 kubenswrapper[7320]: I0312 12:22:15.870616 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d45da9d2-861a-497c-b237-290c70e65ef3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d45da9d2-861a-497c-b237-290c70e65ef3" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:22:15.871092 master-0 kubenswrapper[7320]: I0312 12:22:15.871053 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d45da9d2-861a-497c-b237-290c70e65ef3-kube-api-access-g7qnt" (OuterVolumeSpecName: "kube-api-access-g7qnt") pod "d45da9d2-861a-497c-b237-290c70e65ef3" (UID: "d45da9d2-861a-497c-b237-290c70e65ef3"). InnerVolumeSpecName "kube-api-access-g7qnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:22:15.968783 master-0 kubenswrapper[7320]: I0312 12:22:15.968725 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") pod \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\" (UID: \"624d2bff-bb9a-45ea-ac5b-00bb55d758d7\") " Mar 12 12:22:15.969132 master-0 kubenswrapper[7320]: I0312 12:22:15.969109 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7c1a86e-0ad7-4978-80ae-163dbc44fafb-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.969174 master-0 kubenswrapper[7320]: I0312 12:22:15.969131 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7qnt\" (UniqueName: \"kubernetes.io/projected/d45da9d2-861a-497c-b237-290c70e65ef3-kube-api-access-g7qnt\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.969174 master-0 kubenswrapper[7320]: I0312 12:22:15.969148 7320 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d45da9d2-861a-497c-b237-290c70e65ef3-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.969174 master-0 kubenswrapper[7320]: I0312 12:22:15.969168 7320 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:15.969263 master-0 kubenswrapper[7320]: I0312 12:22:15.969137 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca" (OuterVolumeSpecName: "client-ca") pod "624d2bff-bb9a-45ea-ac5b-00bb55d758d7" (UID: "624d2bff-bb9a-45ea-ac5b-00bb55d758d7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:22:15.969263 master-0 kubenswrapper[7320]: I0312 12:22:15.969182 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vbdcb\" (UniqueName: \"kubernetes.io/projected/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-kube-api-access-vbdcb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:16.070676 master-0 kubenswrapper[7320]: I0312 12:22:16.070613 7320 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/624d2bff-bb9a-45ea-ac5b-00bb55d758d7-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:16.305858 master-0 kubenswrapper[7320]: I0312 12:22:16.305729 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"] Mar 12 12:22:16.306073 master-0 kubenswrapper[7320]: I0312 12:22:16.305920 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" podUID="8e6f7496-1047-482d-9203-ff83a9eb7d93" 
containerName="cluster-version-operator" containerID="cri-o://cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676" gracePeriod=130 Mar 12 12:22:16.701561 master-0 kubenswrapper[7320]: I0312 12:22:16.701471 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8" Mar 12 12:22:16.702001 master-0 kubenswrapper[7320]: I0312 12:22:16.701636 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 12 12:22:16.704814 master-0 kubenswrapper[7320]: I0312 12:22:16.704771 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr" Mar 12 12:22:16.776848 master-0 kubenswrapper[7320]: I0312 12:22:16.776790 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"] Mar 12 12:22:16.776848 master-0 kubenswrapper[7320]: I0312 12:22:16.776847 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9bbcffc94-dpvxr"] Mar 12 12:22:16.794995 master-0 kubenswrapper[7320]: I0312 12:22:16.794933 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8"] Mar 12 12:22:16.798793 master-0 kubenswrapper[7320]: I0312 12:22:16.798755 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5fc9f9dcb7-8f5f8"] Mar 12 12:22:16.812146 master-0 kubenswrapper[7320]: I0312 12:22:16.811708 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 12 12:22:16.815268 master-0 kubenswrapper[7320]: I0312 12:22:16.815156 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 12 12:22:16.885151 master-0 kubenswrapper[7320]: I0312 12:22:16.885083 7320 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d45da9d2-861a-497c-b237-290c70e65ef3-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:17.048780 master-0 kubenswrapper[7320]: I0312 12:22:17.048731 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" Mar 12 12:22:17.087546 master-0 kubenswrapper[7320]: I0312 12:22:17.087433 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") pod \"8e6f7496-1047-482d-9203-ff83a9eb7d93\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " Mar 12 12:22:17.087717 master-0 kubenswrapper[7320]: I0312 12:22:17.087660 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") pod \"8e6f7496-1047-482d-9203-ff83a9eb7d93\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " Mar 12 12:22:17.087783 master-0 kubenswrapper[7320]: I0312 12:22:17.087755 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca\") pod \"8e6f7496-1047-482d-9203-ff83a9eb7d93\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " Mar 12 12:22:17.087900 master-0 kubenswrapper[7320]: I0312 12:22:17.087559 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "8e6f7496-1047-482d-9203-ff83a9eb7d93" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93"). 
InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:22:17.087932 master-0 kubenswrapper[7320]: I0312 12:22:17.087908 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access\") pod \"8e6f7496-1047-482d-9203-ff83a9eb7d93\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " Mar 12 12:22:17.087990 master-0 kubenswrapper[7320]: I0312 12:22:17.087963 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") pod \"8e6f7496-1047-482d-9203-ff83a9eb7d93\" (UID: \"8e6f7496-1047-482d-9203-ff83a9eb7d93\") " Mar 12 12:22:17.088200 master-0 kubenswrapper[7320]: I0312 12:22:17.088159 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca" (OuterVolumeSpecName: "service-ca") pod "8e6f7496-1047-482d-9203-ff83a9eb7d93" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:22:17.088311 master-0 kubenswrapper[7320]: I0312 12:22:17.088284 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "8e6f7496-1047-482d-9203-ff83a9eb7d93" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93"). InnerVolumeSpecName "etc-cvo-updatepayloads". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:22:17.088551 master-0 kubenswrapper[7320]: I0312 12:22:17.088520 7320 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:17.088587 master-0 kubenswrapper[7320]: I0312 12:22:17.088563 7320 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8e6f7496-1047-482d-9203-ff83a9eb7d93-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:17.088623 master-0 kubenswrapper[7320]: I0312 12:22:17.088594 7320 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8e6f7496-1047-482d-9203-ff83a9eb7d93-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:17.090997 master-0 kubenswrapper[7320]: I0312 12:22:17.090959 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8e6f7496-1047-482d-9203-ff83a9eb7d93" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:22:17.091244 master-0 kubenswrapper[7320]: I0312 12:22:17.091219 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8e6f7496-1047-482d-9203-ff83a9eb7d93" (UID: "8e6f7496-1047-482d-9203-ff83a9eb7d93"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:22:17.189519 master-0 kubenswrapper[7320]: I0312 12:22:17.189447 7320 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8e6f7496-1047-482d-9203-ff83a9eb7d93-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:17.189519 master-0 kubenswrapper[7320]: I0312 12:22:17.189505 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8e6f7496-1047-482d-9203-ff83a9eb7d93-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:17.383662 master-0 kubenswrapper[7320]: I0312 12:22:17.383463 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"] Mar 12 12:22:17.383906 master-0 kubenswrapper[7320]: E0312 12:22:17.383679 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e6f7496-1047-482d-9203-ff83a9eb7d93" containerName="cluster-version-operator" Mar 12 12:22:17.383906 master-0 kubenswrapper[7320]: I0312 12:22:17.383702 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e6f7496-1047-482d-9203-ff83a9eb7d93" containerName="cluster-version-operator" Mar 12 12:22:17.383906 master-0 kubenswrapper[7320]: E0312 12:22:17.383717 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerName="installer" Mar 12 12:22:17.383906 master-0 kubenswrapper[7320]: I0312 12:22:17.383724 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerName="installer" Mar 12 12:22:17.383906 master-0 kubenswrapper[7320]: I0312 12:22:17.383792 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e6f7496-1047-482d-9203-ff83a9eb7d93" containerName="cluster-version-operator" Mar 12 12:22:17.383906 master-0 kubenswrapper[7320]: I0312 12:22:17.383811 7320 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerName="installer" Mar 12 12:22:17.384692 master-0 kubenswrapper[7320]: I0312 12:22:17.384114 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:22:17.396569 master-0 kubenswrapper[7320]: I0312 12:22:17.392693 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 12:22:17.396569 master-0 kubenswrapper[7320]: I0312 12:22:17.393009 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 12:22:17.396569 master-0 kubenswrapper[7320]: I0312 12:22:17.393257 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f87d47d96-c24tv"] Mar 12 12:22:17.396569 master-0 kubenswrapper[7320]: I0312 12:22:17.393854 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:22:17.396569 master-0 kubenswrapper[7320]: I0312 12:22:17.393266 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 12:22:17.396569 master-0 kubenswrapper[7320]: I0312 12:22:17.393318 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 12:22:17.396569 master-0 kubenswrapper[7320]: I0312 12:22:17.393745 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 12:22:17.400893 master-0 kubenswrapper[7320]: I0312 12:22:17.398708 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 12 12:22:17.400893 master-0 kubenswrapper[7320]: I0312 12:22:17.398991 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 12 12:22:17.400893 master-0 kubenswrapper[7320]: I0312 12:22:17.399146 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 12 12:22:17.400893 master-0 kubenswrapper[7320]: I0312 12:22:17.399401 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 12:22:17.400893 master-0 kubenswrapper[7320]: I0312 12:22:17.399592 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 12:22:17.413526 master-0 kubenswrapper[7320]: I0312 12:22:17.410316 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 12 12:22:17.426514 master-0 kubenswrapper[7320]: I0312 12:22:17.423446 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-6f87d47d96-c24tv"]
Mar 12 12:22:17.464546 master-0 kubenswrapper[7320]: I0312 12:22:17.460625 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"]
Mar 12 12:22:17.492817 master-0 kubenswrapper[7320]: I0312 12:22:17.492759 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.492817 master-0 kubenswrapper[7320]: I0312 12:22:17.492804 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btbt7\" (UniqueName: \"kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.493053 master-0 kubenswrapper[7320]: I0312 12:22:17.492856 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.493053 master-0 kubenswrapper[7320]: I0312 12:22:17.492900 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles\") pod \"controller-manager-6f87d47d96-c24tv\" (UID:
\"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.493053 master-0 kubenswrapper[7320]: I0312 12:22:17.493005 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.493171 master-0 kubenswrapper[7320]: I0312 12:22:17.493093 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.493171 master-0 kubenswrapper[7320]: I0312 12:22:17.493136 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xggp6\" (UniqueName: \"kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.493252 master-0 kubenswrapper[7320]: I0312 12:22:17.493227 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.493292 master-0
kubenswrapper[7320]: I0312 12:22:17.493252 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.603911 master-0 kubenswrapper[7320]: I0312 12:22:17.603690 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.603911 master-0 kubenswrapper[7320]: I0312 12:22:17.603779 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.603911 master-0 kubenswrapper[7320]: I0312 12:22:17.603817 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xggp6\" (UniqueName: \"kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.603911 master-0 kubenswrapper[7320]: I0312 12:22:17.603885 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName:
\"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.603911 master-0 kubenswrapper[7320]: I0312 12:22:17.603912 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.604832 master-0 kubenswrapper[7320]: I0312 12:22:17.603963 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.604832 master-0 kubenswrapper[7320]: I0312 12:22:17.604013 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btbt7\" (UniqueName: \"kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.604832 master-0 kubenswrapper[7320]: I0312 12:22:17.604085 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar
12 12:22:17.604832 master-0 kubenswrapper[7320]: I0312 12:22:17.604822 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.609955 master-0 kubenswrapper[7320]: I0312 12:22:17.609051 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.612674 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.613748 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.614115 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.614229 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.616297 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.616963 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") "
pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.617791 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.618206 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.618360 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.618546 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.629605 master-0 kubenswrapper[7320]: I0312 12:22:17.621604 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.629605 master-0
kubenswrapper[7320]: I0312 12:22:17.626934 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.632727 master-0 kubenswrapper[7320]: I0312 12:22:17.632690 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 12:22:17.633418 master-0 kubenswrapper[7320]: I0312 12:22:17.633367 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.639537 master-0 kubenswrapper[7320]: I0312 12:22:17.638797 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 12 12:22:17.643534 master-0 kubenswrapper[7320]: I0312 12:22:17.643461 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:22:17.648607 master-0 kubenswrapper[7320]: I0312 12:22:17.648577 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:22:17.657082 master-0 kubenswrapper[7320]: I0312 12:22:17.657031 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btbt7\" (UniqueName: \"kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7\") pod \"controller-manager-6f87d47d96-c24tv\" (UID:
\"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.661977 master-0 kubenswrapper[7320]: I0312 12:22:17.661938 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xggp6\" (UniqueName: \"kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.706122 master-0 kubenswrapper[7320]: I0312 12:22:17.706058 7320 generic.go:334] "Generic (PLEG): container finished" podID="8e6f7496-1047-482d-9203-ff83a9eb7d93" containerID="cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676" exitCode=0
Mar 12 12:22:17.706122 master-0 kubenswrapper[7320]: I0312 12:22:17.706102 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"
Mar 12 12:22:17.706122 master-0 kubenswrapper[7320]: I0312 12:22:17.706128 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" event={"ID":"8e6f7496-1047-482d-9203-ff83a9eb7d93","Type":"ContainerDied","Data":"cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676"}
Mar 12 12:22:17.706997 master-0 kubenswrapper[7320]: I0312 12:22:17.706156 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49" event={"ID":"8e6f7496-1047-482d-9203-ff83a9eb7d93","Type":"ContainerDied","Data":"d8d6e95189dd09735480728d654d01869ba9243b35fbe6a700a9456de7ccf78e"}
Mar 12 12:22:17.706997 master-0 kubenswrapper[7320]: I0312 12:22:17.706173 7320 scope.go:117] "RemoveContainer" containerID="cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676"
Mar 12 12:22:17.721780
master-0 kubenswrapper[7320]: I0312 12:22:17.721728 7320 scope.go:117] "RemoveContainer" containerID="cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676"
Mar 12 12:22:17.721967 master-0 kubenswrapper[7320]: I0312 12:22:17.721783 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:22:17.722093 master-0 kubenswrapper[7320]: E0312 12:22:17.722071 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676\": container with ID starting with cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676 not found: ID does not exist" containerID="cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676"
Mar 12 12:22:17.722267 master-0 kubenswrapper[7320]: I0312 12:22:17.722105 7320 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676"} err="failed to get container status \"cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676\": rpc error: code = NotFound desc = could not find container \"cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676\": container with ID starting with cc90016cc4d18330cb01179c556b738346a71276b326896039a57550c7da3676 not found: ID does not exist"
Mar 12 12:22:17.746367 master-0 kubenswrapper[7320]: I0312 12:22:17.744813 7320 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:22:17.757348 master-0 kubenswrapper[7320]: I0312 12:22:17.757283 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k8t84"
Mar 12 12:22:17.770315 master-0 kubenswrapper[7320]: I0312 12:22:17.770231 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="624d2bff-bb9a-45ea-ac5b-00bb55d758d7" path="/var/lib/kubelet/pods/624d2bff-bb9a-45ea-ac5b-00bb55d758d7/volumes"
Mar 12 12:22:17.770830 master-0 kubenswrapper[7320]: I0312 12:22:17.770779 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d45da9d2-861a-497c-b237-290c70e65ef3" path="/var/lib/kubelet/pods/d45da9d2-861a-497c-b237-290c70e65ef3/volumes"
Mar 12 12:22:17.771355 master-0 kubenswrapper[7320]: I0312 12:22:17.771175 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" path="/var/lib/kubelet/pods/e7c1a86e-0ad7-4978-80ae-163dbc44fafb/volumes"
Mar 12 12:22:17.771990 master-0 kubenswrapper[7320]: I0312 12:22:17.771795 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"]
Mar 12 12:22:17.771990 master-0 kubenswrapper[7320]: I0312 12:22:17.771832 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-b2t49"]
Mar 12 12:22:17.814505 master-0 kubenswrapper[7320]: I0312 12:22:17.813858 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"]
Mar 12 12:22:17.814705 master-0 kubenswrapper[7320]: I0312 12:22:17.814563 7320 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:17.823040 master-0 kubenswrapper[7320]: I0312 12:22:17.818702 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 12 12:22:17.823040 master-0 kubenswrapper[7320]: I0312 12:22:17.818744 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 12 12:22:17.823040 master-0 kubenswrapper[7320]: I0312 12:22:17.818980 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 12 12:22:17.908180 master-0 kubenswrapper[7320]: I0312 12:22:17.907806 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:17.908180 master-0 kubenswrapper[7320]: I0312 12:22:17.907917 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:17.908180 master-0 kubenswrapper[7320]: I0312 12:22:17.908020 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID:
\"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:17.908180 master-0 kubenswrapper[7320]: I0312 12:22:17.908058 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:17.908180 master-0 kubenswrapper[7320]: I0312 12:22:17.908099 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.009757 master-0 kubenswrapper[7320]: I0312 12:22:18.009703 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.009757 master-0 kubenswrapper[7320]: I0312 12:22:18.009597 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.010029 master-0 kubenswrapper[7320]: I0312
12:22:18.009782 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.010029 master-0 kubenswrapper[7320]: I0312 12:22:18.009899 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.010029 master-0 kubenswrapper[7320]: I0312 12:22:18.009968 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.010029 master-0 kubenswrapper[7320]: I0312 12:22:18.010031 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.010430 master-0 kubenswrapper[7320]: I0312 12:22:18.010406 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-7dg9w\"
(UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.012201 master-0 kubenswrapper[7320]: I0312 12:22:18.012150 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.014108 master-0 kubenswrapper[7320]: I0312 12:22:18.014073 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.030096 master-0 kubenswrapper[7320]: I0312 12:22:18.030041 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.141043 master-0 kubenswrapper[7320]: I0312 12:22:18.140934 7320 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:22:18.153642 master-0 kubenswrapper[7320]: W0312 12:22:18.153597 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod114b1d16_b37d_449c_84e3_3fb3f8b20eaa.slice/crio-3b72e72a154930e6638978b9d96d1fb9f48ad70245754bc46fa997ab9b768457 WatchSource:0}: Error finding container 3b72e72a154930e6638978b9d96d1fb9f48ad70245754bc46fa997ab9b768457: Status 404 returned error can't find the container with id 3b72e72a154930e6638978b9d96d1fb9f48ad70245754bc46fa997ab9b768457
Mar 12 12:22:18.171543 master-0 kubenswrapper[7320]: I0312 12:22:18.171500 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"]
Mar 12 12:22:18.177925 master-0 kubenswrapper[7320]: W0312 12:22:18.177856 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod021b22e3_b4c5_426d_b761_181f1e54175d.slice/crio-de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32 WatchSource:0}: Error finding container de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32: Status 404 returned error can't find the container with id de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32
Mar 12 12:22:18.242924 master-0 kubenswrapper[7320]: I0312 12:22:18.242865 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f87d47d96-c24tv"]
Mar 12 12:22:18.255144 master-0 kubenswrapper[7320]: W0312 12:22:18.255081 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce1515a8_5e96_4b3b_b2e0_b764e5a25dd0.slice/crio-76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9 WatchSource:0}: Error finding container
76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9: Status 404 returned error can't find the container with id 76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9
Mar 12 12:22:18.716065 master-0 kubenswrapper[7320]: I0312 12:22:18.716002 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" event={"ID":"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0","Type":"ContainerStarted","Data":"76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9"}
Mar 12 12:22:18.717454 master-0 kubenswrapper[7320]: I0312 12:22:18.717220 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" event={"ID":"021b22e3-b4c5-426d-b761-181f1e54175d","Type":"ContainerStarted","Data":"de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32"}
Mar 12 12:22:18.720066 master-0 kubenswrapper[7320]: I0312 12:22:18.719822 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" event={"ID":"114b1d16-b37d-449c-84e3-3fb3f8b20eaa","Type":"ContainerStarted","Data":"d4a43fbaaf0cd2d913ef5abe0c226480d6e639296585062ad2bcfe4ba7c6658f"}
Mar 12 12:22:18.720066 master-0 kubenswrapper[7320]: I0312 12:22:18.719866 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" event={"ID":"114b1d16-b37d-449c-84e3-3fb3f8b20eaa","Type":"ContainerStarted","Data":"3b72e72a154930e6638978b9d96d1fb9f48ad70245754bc46fa997ab9b768457"}
Mar 12 12:22:18.747503 master-0 kubenswrapper[7320]: I0312 12:22:18.745578 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" podStartSLOduration=1.745560461 podStartE2EDuration="1.745560461s" podCreationTimestamp="2026-03-12 12:22:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:18.744771928 +0000 UTC m=+61.303815819" watchObservedRunningTime="2026-03-12 12:22:18.745560461 +0000 UTC m=+61.304604352"
Mar 12 12:22:19.764509 master-0 kubenswrapper[7320]: I0312 12:22:19.764300 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6f7496-1047-482d-9203-ff83a9eb7d93" path="/var/lib/kubelet/pods/8e6f7496-1047-482d-9203-ff83a9eb7d93/volumes"
Mar 12 12:22:21.153583 master-0 kubenswrapper[7320]: I0312 12:22:21.153536 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 12 12:22:21.154432 master-0 kubenswrapper[7320]: I0312 12:22:21.153748 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="a59a6bb7-f966-4208-ba85-452095404891" containerName="installer" containerID="cri-o://b0b0a71bb15ee38a2037cc0d67a425037c9a862e431396ce17c0501ae76f6aae" gracePeriod=30
Mar 12 12:22:22.381023 master-0 kubenswrapper[7320]: I0312 12:22:22.380966 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:22:22.381675 master-0 kubenswrapper[7320]: I0312 12:22:22.381642 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:22:22.385520 master-0 kubenswrapper[7320]: I0312 12:22:22.384176 7320 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:22:22.385520 master-0 kubenswrapper[7320]: I0312 12:22:22.384752 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:22:22.482293 master-0 kubenswrapper[7320]: I0312 12:22:22.482224 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:22:22.482553 master-0 kubenswrapper[7320]: I0312 12:22:22.482489 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:22:22.482625 master-0 kubenswrapper[7320]: I0312 12:22:22.482558 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:22:22.482625 master-0 kubenswrapper[7320]: I0312 12:22:22.482592 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:22:22.482625 master-0 kubenswrapper[7320]: I0312 12:22:22.482618 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:22:22.482748 master-0 kubenswrapper[7320]: I0312 12:22:22.482647 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:22:22.487583 master-0 kubenswrapper[7320]: I0312 12:22:22.487511 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:22:22.487583 master-0 kubenswrapper[7320]: I0312 12:22:22.487570 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " 
pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:22:22.487776 master-0 kubenswrapper[7320]: I0312 12:22:22.487625 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:22:22.487776 master-0 kubenswrapper[7320]: I0312 12:22:22.487678 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:22:22.489542 master-0 kubenswrapper[7320]: I0312 12:22:22.489498 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:22:22.490635 master-0 kubenswrapper[7320]: I0312 12:22:22.490595 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:22:22.495521 master-0 kubenswrapper[7320]: I0312 12:22:22.495430 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:22:22.495521 master-0 kubenswrapper[7320]: I0312 12:22:22.495463 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:22:22.497316 master-0 kubenswrapper[7320]: I0312 12:22:22.497282 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:22:22.750396 master-0 kubenswrapper[7320]: I0312 12:22:22.749462 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" event={"ID":"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0","Type":"ContainerStarted","Data":"cddd485ac51118295f95288e1b47d7560d221f73530c7f79d262c83d31f5faa3"} Mar 12 12:22:22.750396 master-0 kubenswrapper[7320]: I0312 12:22:22.749898 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:22:22.752882 master-0 kubenswrapper[7320]: I0312 12:22:22.752838 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" event={"ID":"021b22e3-b4c5-426d-b761-181f1e54175d","Type":"ContainerStarted","Data":"69a3615121435266c91b06fdc9e703ea81cecb073e9b8530439411a8ada925fe"} Mar 12 12:22:22.753289 master-0 kubenswrapper[7320]: I0312 12:22:22.753255 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:22:22.758460 master-0 kubenswrapper[7320]: I0312 12:22:22.756675 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:22:22.760257 master-0 kubenswrapper[7320]: I0312 12:22:22.760213 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:22:22.788154 master-0 kubenswrapper[7320]: I0312 
12:22:22.785442 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:22:22.788154 master-0 kubenswrapper[7320]: I0312 12:22:22.786763 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:22:22.802101 master-0 kubenswrapper[7320]: I0312 12:22:22.800712 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" podStartSLOduration=4.206398046 podStartE2EDuration="7.800693853s" podCreationTimestamp="2026-03-12 12:22:15 +0000 UTC" firstStartedPulling="2026-03-12 12:22:18.258461457 +0000 UTC m=+60.817505338" lastFinishedPulling="2026-03-12 12:22:21.852757264 +0000 UTC m=+64.411801145" observedRunningTime="2026-03-12 12:22:22.798686944 +0000 UTC m=+65.357730835" watchObservedRunningTime="2026-03-12 12:22:22.800693853 +0000 UTC m=+65.359737734" Mar 12 12:22:22.842810 master-0 kubenswrapper[7320]: I0312 12:22:22.842671 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" podStartSLOduration=4.188071348 podStartE2EDuration="7.842646732s" podCreationTimestamp="2026-03-12 12:22:15 +0000 UTC" firstStartedPulling="2026-03-12 12:22:18.180185423 +0000 UTC m=+60.739229304" lastFinishedPulling="2026-03-12 12:22:21.834760807 +0000 UTC m=+64.393804688" observedRunningTime="2026-03-12 12:22:22.839102548 +0000 UTC m=+65.398146429" watchObservedRunningTime="2026-03-12 12:22:22.842646732 +0000 UTC m=+65.401690613" Mar 12 12:22:23.128360 master-0 kubenswrapper[7320]: I0312 12:22:23.125634 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"] Mar 12 12:22:23.156103 master-0 kubenswrapper[7320]: I0312 12:22:23.152502 
7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-xpzn2"] Mar 12 12:22:23.203846 master-0 kubenswrapper[7320]: I0312 12:22:23.203647 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 12 12:22:23.204994 master-0 kubenswrapper[7320]: I0312 12:22:23.204964 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.206459 master-0 kubenswrapper[7320]: I0312 12:22:23.206386 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 12 12:22:23.293036 master-0 kubenswrapper[7320]: I0312 12:22:23.292413 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4m9jh"] Mar 12 12:22:23.293036 master-0 kubenswrapper[7320]: I0312 12:22:23.292492 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"] Mar 12 12:22:23.314791 master-0 kubenswrapper[7320]: I0312 12:22:23.312297 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-var-lock\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.314791 master-0 kubenswrapper[7320]: I0312 12:22:23.312343 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.314791 master-0 kubenswrapper[7320]: I0312 12:22:23.312382 7320 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.314791 master-0 kubenswrapper[7320]: W0312 12:22:23.313292 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2269d7_f11f_46d1_95e7_f89a70ee1152.slice/crio-2add9d2e5cdf8eafbdab4b7a44808b2e6f41988b57e334321268362b6270eb86 WatchSource:0}: Error finding container 2add9d2e5cdf8eafbdab4b7a44808b2e6f41988b57e334321268362b6270eb86: Status 404 returned error can't find the container with id 2add9d2e5cdf8eafbdab4b7a44808b2e6f41988b57e334321268362b6270eb86 Mar 12 12:22:23.315709 master-0 kubenswrapper[7320]: I0312 12:22:23.315670 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"] Mar 12 12:22:23.331637 master-0 kubenswrapper[7320]: W0312 12:22:23.331575 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bc7dea3_1868_488c_a34b_288cde3acd35.slice/crio-a7cef5c725422fbd2eaaa42e7229e1a33d2367f18d70fb9f0b28995a04b77f89 WatchSource:0}: Error finding container a7cef5c725422fbd2eaaa42e7229e1a33d2367f18d70fb9f0b28995a04b77f89: Status 404 returned error can't find the container with id a7cef5c725422fbd2eaaa42e7229e1a33d2367f18d70fb9f0b28995a04b77f89 Mar 12 12:22:23.414647 master-0 kubenswrapper[7320]: I0312 12:22:23.414536 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-var-lock\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " 
pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.414647 master-0 kubenswrapper[7320]: I0312 12:22:23.414588 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.414647 master-0 kubenswrapper[7320]: I0312 12:22:23.414615 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.415200 master-0 kubenswrapper[7320]: I0312 12:22:23.414698 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.415200 master-0 kubenswrapper[7320]: I0312 12:22:23.414734 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-var-lock\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.429645 master-0 kubenswrapper[7320]: I0312 12:22:23.429557 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " 
pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.521944 master-0 kubenswrapper[7320]: I0312 12:22:23.521837 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"] Mar 12 12:22:23.533427 master-0 kubenswrapper[7320]: W0312 12:22:23.532707 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c02552c_a477_4c6c_8a45_2fdc758c084b.slice/crio-70d56295fb2db27f91085477bc0d85a5a75eec054bb4637a7df1876812db41e7 WatchSource:0}: Error finding container 70d56295fb2db27f91085477bc0d85a5a75eec054bb4637a7df1876812db41e7: Status 404 returned error can't find the container with id 70d56295fb2db27f91085477bc0d85a5a75eec054bb4637a7df1876812db41e7 Mar 12 12:22:23.535286 master-0 kubenswrapper[7320]: I0312 12:22:23.534608 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"] Mar 12 12:22:23.546359 master-0 kubenswrapper[7320]: I0312 12:22:23.546323 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:22:23.777691 master-0 kubenswrapper[7320]: I0312 12:22:23.777579 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" event={"ID":"3c02552c-a477-4c6c-8a45-2fdc758c084b","Type":"ContainerStarted","Data":"70d56295fb2db27f91085477bc0d85a5a75eec054bb4637a7df1876812db41e7"} Mar 12 12:22:23.778208 master-0 kubenswrapper[7320]: I0312 12:22:23.778183 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" event={"ID":"74d06933-afab-43a3-a1d3-88a569178d34","Type":"ContainerStarted","Data":"91dfd404945dfa72b72f5ee55329cfd84189e41ee3e0a4c889c8d9f86c69e940"} Mar 12 12:22:23.779257 master-0 kubenswrapper[7320]: I0312 12:22:23.779228 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4m9jh" event={"ID":"e64bc838-280e-4231-9732-1adb69fed0bc","Type":"ContainerStarted","Data":"7e3651e4363549a04b82ec26394498f6d4ec7456d1ae54da099d3f3dc779acb3"} Mar 12 12:22:23.780247 master-0 kubenswrapper[7320]: I0312 12:22:23.780220 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" event={"ID":"d961a5f0-84b7-47d7-846b-238475947121","Type":"ContainerStarted","Data":"bfec511e21bbf0dfd869b19d5fa48c209028875c1e502c22cc890b2268398c69"} Mar 12 12:22:23.781750 master-0 kubenswrapper[7320]: I0312 12:22:23.781362 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" event={"ID":"ae2269d7-f11f-46d1-95e7-f89a70ee1152","Type":"ContainerStarted","Data":"2add9d2e5cdf8eafbdab4b7a44808b2e6f41988b57e334321268362b6270eb86"} Mar 12 12:22:23.783975 master-0 kubenswrapper[7320]: I0312 12:22:23.783945 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" event={"ID":"d1d16bbc-778b-4fc1-abb2-b43e79a7c532","Type":"ContainerStarted","Data":"9439da09bf23b4914a7963b9cc80bd159a0b65a9b74f2475f14ac540feb3ad07"} Mar 12 12:22:23.784035 master-0 kubenswrapper[7320]: I0312 12:22:23.783990 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" event={"ID":"d1d16bbc-778b-4fc1-abb2-b43e79a7c532","Type":"ContainerStarted","Data":"d7aae39ac0f8cea9552f3bc1f796a0a6a68481f6079d7651593a9da0b4c18f5a"} Mar 12 12:22:23.784855 master-0 kubenswrapper[7320]: I0312 12:22:23.784833 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" event={"ID":"9bc7dea3-1868-488c-a34b-288cde3acd35","Type":"ContainerStarted","Data":"a7cef5c725422fbd2eaaa42e7229e1a33d2367f18d70fb9f0b28995a04b77f89"} Mar 12 12:22:23.995518 master-0 kubenswrapper[7320]: I0312 12:22:23.995318 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 12 12:22:24.007500 master-0 kubenswrapper[7320]: W0312 12:22:24.007445 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb7aa62dd_2de4_4511_a7e7_27f45fe97cc1.slice/crio-06325f241e9a80c6d8dd85ba7c0c06cd73e1284031ca3e0cffe506e69d0bffc7 WatchSource:0}: Error finding container 06325f241e9a80c6d8dd85ba7c0c06cd73e1284031ca3e0cffe506e69d0bffc7: Status 404 returned error can't find the container with id 06325f241e9a80c6d8dd85ba7c0c06cd73e1284031ca3e0cffe506e69d0bffc7 Mar 12 12:22:24.793900 master-0 kubenswrapper[7320]: I0312 12:22:24.793360 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1","Type":"ContainerStarted","Data":"1d7fe59666c56c328d192c878aec970270ae5b794f380892a172dc12ef6839ec"} Mar 12 
12:22:24.793900 master-0 kubenswrapper[7320]: I0312 12:22:24.793439 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1","Type":"ContainerStarted","Data":"06325f241e9a80c6d8dd85ba7c0c06cd73e1284031ca3e0cffe506e69d0bffc7"} Mar 12 12:22:26.937433 master-0 kubenswrapper[7320]: I0312 12:22:26.937374 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=3.9373586449999998 podStartE2EDuration="3.937358645s" podCreationTimestamp="2026-03-12 12:22:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:22:24.919808462 +0000 UTC m=+67.478852343" watchObservedRunningTime="2026-03-12 12:22:26.937358645 +0000 UTC m=+69.496402526" Mar 12 12:22:26.941626 master-0 kubenswrapper[7320]: I0312 12:22:26.937978 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 12 12:22:26.941626 master-0 kubenswrapper[7320]: I0312 12:22:26.938448 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:26.943026 master-0 kubenswrapper[7320]: I0312 12:22:26.942992 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 12:22:26.943026 master-0 kubenswrapper[7320]: I0312 12:22:26.943002 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-bkl67" Mar 12 12:22:27.075576 master-0 kubenswrapper[7320]: I0312 12:22:27.075513 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.075576 master-0 kubenswrapper[7320]: I0312 12:22:27.075569 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.075819 master-0 kubenswrapper[7320]: I0312 12:22:27.075600 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.176644 master-0 kubenswrapper[7320]: I0312 12:22:27.176595 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod 
\"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.176817 master-0 kubenswrapper[7320]: I0312 12:22:27.176666 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.176817 master-0 kubenswrapper[7320]: I0312 12:22:27.176743 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.177065 master-0 kubenswrapper[7320]: I0312 12:22:27.177039 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.177140 master-0 kubenswrapper[7320]: I0312 12:22:27.177123 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:27.710308 master-0 kubenswrapper[7320]: I0312 12:22:27.710241 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 12 12:22:28.227409 master-0 kubenswrapper[7320]: I0312 12:22:28.223636 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 12 12:22:28.227409 master-0 kubenswrapper[7320]: I0312 12:22:28.223831 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="a381a5c1-050f-496f-b13f-eefa7310557f" containerName="installer" containerID="cri-o://57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4" gracePeriod=30 Mar 12 12:22:28.275628 master-0 kubenswrapper[7320]: I0312 12:22:28.275570 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:28.456033 master-0 kubenswrapper[7320]: I0312 12:22:28.455976 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:22:31.046138 master-0 kubenswrapper[7320]: I0312 12:22:31.046066 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 12 12:22:31.047040 master-0 kubenswrapper[7320]: I0312 12:22:31.046821 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.050106 master-0 kubenswrapper[7320]: I0312 12:22:31.049937 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-rwh96" Mar 12 12:22:31.058254 master-0 kubenswrapper[7320]: I0312 12:22:31.058210 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 12 12:22:31.132123 master-0 kubenswrapper[7320]: I0312 12:22:31.131913 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.132123 master-0 kubenswrapper[7320]: I0312 12:22:31.131962 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.132123 master-0 kubenswrapper[7320]: I0312 12:22:31.132020 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.173948 master-0 kubenswrapper[7320]: I0312 12:22:31.173806 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"] Mar 12 12:22:31.174770 master-0 
kubenswrapper[7320]: I0312 12:22:31.174730 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:31.177510 master-0 kubenswrapper[7320]: I0312 12:22:31.177303 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 12:22:31.178070 master-0 kubenswrapper[7320]: I0312 12:22:31.177983 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-spkrx" Mar 12 12:22:31.178070 master-0 kubenswrapper[7320]: I0312 12:22:31.178018 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 12:22:31.178230 master-0 kubenswrapper[7320]: I0312 12:22:31.178153 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 12:22:31.188391 master-0 kubenswrapper[7320]: I0312 12:22:31.188328 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"] Mar 12 12:22:31.233442 master-0 kubenswrapper[7320]: I0312 12:22:31.233375 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.233442 master-0 kubenswrapper[7320]: I0312 12:22:31.233433 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " 
pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.233716 master-0 kubenswrapper[7320]: I0312 12:22:31.233520 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.233716 master-0 kubenswrapper[7320]: I0312 12:22:31.233551 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.233716 master-0 kubenswrapper[7320]: I0312 12:22:31.233628 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.250113 master-0 kubenswrapper[7320]: I0312 12:22:31.250076 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.334816 master-0 kubenswrapper[7320]: I0312 12:22:31.334712 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdfnh\" (UniqueName: \"kubernetes.io/projected/c62edaec-38e2-4b73-8bb5-c776abfb310f-kube-api-access-cdfnh\") pod 
\"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:31.334816 master-0 kubenswrapper[7320]: I0312 12:22:31.334763 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c62edaec-38e2-4b73-8bb5-c776abfb310f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:31.368317 master-0 kubenswrapper[7320]: I0312 12:22:31.368269 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:22:31.437055 master-0 kubenswrapper[7320]: I0312 12:22:31.436807 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdfnh\" (UniqueName: \"kubernetes.io/projected/c62edaec-38e2-4b73-8bb5-c776abfb310f-kube-api-access-cdfnh\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:31.437055 master-0 kubenswrapper[7320]: I0312 12:22:31.436887 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c62edaec-38e2-4b73-8bb5-c776abfb310f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:31.443309 master-0 kubenswrapper[7320]: I0312 12:22:31.443268 7320 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c62edaec-38e2-4b73-8bb5-c776abfb310f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:31.456850 master-0 kubenswrapper[7320]: I0312 12:22:31.456797 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdfnh\" (UniqueName: \"kubernetes.io/projected/c62edaec-38e2-4b73-8bb5-c776abfb310f-kube-api-access-cdfnh\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:31.497685 master-0 kubenswrapper[7320]: I0312 12:22:31.497630 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:22:32.829403 master-0 kubenswrapper[7320]: I0312 12:22:32.829325 7320 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 12 12:22:32.829841 master-0 kubenswrapper[7320]: I0312 12:22:32.829567 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" containerID="cri-o://e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6" gracePeriod=30 Mar 12 12:22:32.829841 master-0 kubenswrapper[7320]: I0312 12:22:32.829690 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" containerID="cri-o://75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714" gracePeriod=30 Mar 12 12:22:32.833691 master-0 kubenswrapper[7320]: I0312 12:22:32.832827 7320 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 12 12:22:32.833691 master-0 kubenswrapper[7320]: E0312 12:22:32.832997 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 12 12:22:32.833691 master-0 kubenswrapper[7320]: I0312 12:22:32.833008 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 12 12:22:32.833691 master-0 kubenswrapper[7320]: E0312 12:22:32.833018 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 12 12:22:32.833691 master-0 kubenswrapper[7320]: I0312 12:22:32.833023 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 12 12:22:32.833691 master-0 
kubenswrapper[7320]: I0312 12:22:32.833110 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 12 12:22:32.833691 master-0 kubenswrapper[7320]: I0312 12:22:32.833129 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 12 12:22:32.839344 master-0 kubenswrapper[7320]: I0312 12:22:32.838567 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 12 12:22:32.957926 master-0 kubenswrapper[7320]: I0312 12:22:32.957862 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:32.957926 master-0 kubenswrapper[7320]: I0312 12:22:32.957908 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:32.957926 master-0 kubenswrapper[7320]: I0312 12:22:32.957932 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:32.958258 master-0 kubenswrapper[7320]: I0312 12:22:32.957949 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: 
\"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:32.958258 master-0 kubenswrapper[7320]: I0312 12:22:32.957968 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:32.958258 master-0 kubenswrapper[7320]: I0312 12:22:32.958024 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.059801 master-0 kubenswrapper[7320]: I0312 12:22:33.059704 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.059801 master-0 kubenswrapper[7320]: I0312 12:22:33.059758 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060041 master-0 kubenswrapper[7320]: I0312 12:22:33.059850 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060041 master-0 kubenswrapper[7320]: I0312 12:22:33.059916 7320 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060118 master-0 kubenswrapper[7320]: I0312 12:22:33.060063 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060118 master-0 kubenswrapper[7320]: I0312 12:22:33.060089 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060118 master-0 kubenswrapper[7320]: I0312 12:22:33.060104 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060206 master-0 kubenswrapper[7320]: I0312 12:22:33.060153 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060206 master-0 kubenswrapper[7320]: I0312 12:22:33.060175 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" 
(UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060206 master-0 kubenswrapper[7320]: I0312 12:22:33.060197 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060293 master-0 kubenswrapper[7320]: I0312 12:22:33.060217 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.060293 master-0 kubenswrapper[7320]: I0312 12:22:33.060240 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:22:33.842492 master-0 kubenswrapper[7320]: I0312 12:22:33.842209 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" event={"ID":"3c02552c-a477-4c6c-8a45-2fdc758c084b","Type":"ContainerStarted","Data":"26e67be1a2a9c4fa709a8c73f483e4f52cb5856a4a6c41d3c7dc95bd7856e253"} Mar 12 12:22:33.842492 master-0 kubenswrapper[7320]: I0312 12:22:33.842265 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:22:33.843834 master-0 kubenswrapper[7320]: I0312 12:22:33.843765 7320 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-rgstx container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.7:8080/healthz\": 
dial tcp 10.128.0.7:8080: connect: connection refused" start-of-body= Mar 12 12:22:33.843911 master-0 kubenswrapper[7320]: I0312 12:22:33.843812 7320 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" podUID="3c02552c-a477-4c6c-8a45-2fdc758c084b" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.7:8080/healthz\": dial tcp 10.128.0.7:8080: connect: connection refused" Mar 12 12:22:33.845429 master-0 kubenswrapper[7320]: I0312 12:22:33.844739 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" event={"ID":"ae2269d7-f11f-46d1-95e7-f89a70ee1152","Type":"ContainerStarted","Data":"46f5319563a827ff86f993ac168c378b59679d2793315edaeb0c316bdda94906"} Mar 12 12:22:34.851737 master-0 kubenswrapper[7320]: I0312 12:22:34.851666 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4m9jh" event={"ID":"e64bc838-280e-4231-9732-1adb69fed0bc","Type":"ContainerStarted","Data":"500c7ffa5e75ec0a24c1079e2b11498f198ab95eb000bc429ec6185fae87557d"} Mar 12 12:22:34.851737 master-0 kubenswrapper[7320]: I0312 12:22:34.851737 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4m9jh" event={"ID":"e64bc838-280e-4231-9732-1adb69fed0bc","Type":"ContainerStarted","Data":"721dc96523301745601044aab5e67f810ef6e18f0f55e991c357295ed3dbb94b"} Mar 12 12:22:34.852980 master-0 kubenswrapper[7320]: I0312 12:22:34.852920 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" event={"ID":"d961a5f0-84b7-47d7-846b-238475947121","Type":"ContainerStarted","Data":"500e92ed2e084af2e7e87754fde25e08ea680d95e255a969f1e5b249d9b80765"} Mar 12 12:22:34.853200 master-0 kubenswrapper[7320]: I0312 12:22:34.853168 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:22:34.856776 master-0 kubenswrapper[7320]: I0312 12:22:34.856748 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" event={"ID":"d1d16bbc-778b-4fc1-abb2-b43e79a7c532","Type":"ContainerStarted","Data":"392bacabbad99777b25b231313d0a57eab1ab274431a1195e628095a52c28f22"} Mar 12 12:22:34.857085 master-0 kubenswrapper[7320]: I0312 12:22:34.857013 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:22:34.857629 master-0 kubenswrapper[7320]: I0312 12:22:34.857519 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:22:34.858788 master-0 kubenswrapper[7320]: I0312 12:22:34.858755 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" event={"ID":"9bc7dea3-1868-488c-a34b-288cde3acd35","Type":"ContainerStarted","Data":"66ac5cb5ac01ba22dc5debb8648c5e910f8e7732cb1f9e6097ebc2965cc2ccbc"} Mar 12 12:22:34.859962 master-0 kubenswrapper[7320]: I0312 12:22:34.859927 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:22:34.862375 master-0 kubenswrapper[7320]: I0312 12:22:34.862332 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" event={"ID":"74d06933-afab-43a3-a1d3-88a569178d34","Type":"ContainerStarted","Data":"8cf7a7986d7461740b2a22c40e2265aaa2c172b0bc619290d4b6ae5354a29eb5"} Mar 12 12:22:34.862615 master-0 kubenswrapper[7320]: I0312 12:22:34.862591 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" event={"ID":"74d06933-afab-43a3-a1d3-88a569178d34","Type":"ContainerStarted","Data":"e8a0643d2fccb7a3a8b1a9999fe3ff756f073c2a63121cc490f340f51eaf7001"} Mar 12 12:22:34.866199 master-0 kubenswrapper[7320]: I0312 12:22:34.866144 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:22:34.866673 master-0 kubenswrapper[7320]: I0312 12:22:34.866635 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:22:44.740276 master-0 kubenswrapper[7320]: I0312 12:22:44.740201 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a381a5c1-050f-496f-b13f-eefa7310557f/installer/0.log" Mar 12 12:22:44.741130 master-0 kubenswrapper[7320]: I0312 12:22:44.740320 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 12 12:22:44.856186 master-0 kubenswrapper[7320]: I0312 12:22:44.856153 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-var-lock\") pod \"a381a5c1-050f-496f-b13f-eefa7310557f\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " Mar 12 12:22:44.856430 master-0 kubenswrapper[7320]: I0312 12:22:44.856287 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-var-lock" (OuterVolumeSpecName: "var-lock") pod "a381a5c1-050f-496f-b13f-eefa7310557f" (UID: "a381a5c1-050f-496f-b13f-eefa7310557f"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:22:44.856491 master-0 kubenswrapper[7320]: I0312 12:22:44.856399 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a381a5c1-050f-496f-b13f-eefa7310557f-kube-api-access\") pod \"a381a5c1-050f-496f-b13f-eefa7310557f\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " Mar 12 12:22:44.856581 master-0 kubenswrapper[7320]: I0312 12:22:44.856558 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-kubelet-dir\") pod \"a381a5c1-050f-496f-b13f-eefa7310557f\" (UID: \"a381a5c1-050f-496f-b13f-eefa7310557f\") " Mar 12 12:22:44.856773 master-0 kubenswrapper[7320]: I0312 12:22:44.856718 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a381a5c1-050f-496f-b13f-eefa7310557f" (UID: "a381a5c1-050f-496f-b13f-eefa7310557f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:22:44.857104 master-0 kubenswrapper[7320]: I0312 12:22:44.857071 7320 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:44.857104 master-0 kubenswrapper[7320]: I0312 12:22:44.857100 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a381a5c1-050f-496f-b13f-eefa7310557f-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:44.860551 master-0 kubenswrapper[7320]: I0312 12:22:44.860526 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a381a5c1-050f-496f-b13f-eefa7310557f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a381a5c1-050f-496f-b13f-eefa7310557f" (UID: "a381a5c1-050f-496f-b13f-eefa7310557f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:22:44.918740 master-0 kubenswrapper[7320]: I0312 12:22:44.918681 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_a381a5c1-050f-496f-b13f-eefa7310557f/installer/0.log" Mar 12 12:22:44.918947 master-0 kubenswrapper[7320]: I0312 12:22:44.918754 7320 generic.go:334] "Generic (PLEG): container finished" podID="a381a5c1-050f-496f-b13f-eefa7310557f" containerID="57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4" exitCode=1 Mar 12 12:22:44.918947 master-0 kubenswrapper[7320]: I0312 12:22:44.918797 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a381a5c1-050f-496f-b13f-eefa7310557f","Type":"ContainerDied","Data":"57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4"} Mar 12 12:22:44.918947 master-0 kubenswrapper[7320]: I0312 12:22:44.918837 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"a381a5c1-050f-496f-b13f-eefa7310557f","Type":"ContainerDied","Data":"e52f1e71eae22523cdfa825351c7af19e24ac0cd6dee50cd3b0de236bbadccfe"} Mar 12 12:22:44.918947 master-0 kubenswrapper[7320]: I0312 12:22:44.918864 7320 scope.go:117] "RemoveContainer" containerID="57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4" Mar 12 12:22:44.919239 master-0 kubenswrapper[7320]: I0312 12:22:44.919015 7320 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 12 12:22:44.943333 master-0 kubenswrapper[7320]: I0312 12:22:44.943293 7320 scope.go:117] "RemoveContainer" containerID="57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4" Mar 12 12:22:44.943828 master-0 kubenswrapper[7320]: E0312 12:22:44.943783 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4\": container with ID starting with 57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4 not found: ID does not exist" containerID="57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4" Mar 12 12:22:44.944052 master-0 kubenswrapper[7320]: I0312 12:22:44.944007 7320 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4"} err="failed to get container status \"57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4\": rpc error: code = NotFound desc = could not find container \"57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4\": container with ID starting with 57b44757e99b16cf55064b78a5d1da36dbc02d5322543c7084ba07766672b7b4 not found: ID does not exist" Mar 12 12:22:44.958085 master-0 kubenswrapper[7320]: I0312 12:22:44.958001 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a381a5c1-050f-496f-b13f-eefa7310557f-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:22:45.879445 master-0 kubenswrapper[7320]: E0312 12:22:45.879378 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 12 12:22:45.880306 master-0 kubenswrapper[7320]: I0312 12:22:45.879873 7320 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 12 12:22:45.911604 master-0 kubenswrapper[7320]: W0312 12:22:45.911512 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice/crio-4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8 WatchSource:0}: Error finding container 4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8: Status 404 returned error can't find the container with id 4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8 Mar 12 12:22:45.925325 master-0 kubenswrapper[7320]: I0312 12:22:45.925243 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8"} Mar 12 12:22:46.936624 master-0 kubenswrapper[7320]: I0312 12:22:46.936522 7320 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="c768108f96dad44b9e3bcf8d0d6db5eb9d2c1ac1b865d93bd4ff9f67e7bb635a" exitCode=0 Mar 12 12:22:46.937568 master-0 kubenswrapper[7320]: I0312 12:22:46.936640 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"c768108f96dad44b9e3bcf8d0d6db5eb9d2c1ac1b865d93bd4ff9f67e7bb635a"} Mar 12 12:22:46.945231 master-0 kubenswrapper[7320]: I0312 12:22:46.945140 7320 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="7832f5ffa2c5ed0b1534228e4894dd1d7b32cd0726e9bdedd6ffb73456947fa0" exitCode=1 Mar 12 12:22:46.945231 master-0 kubenswrapper[7320]: I0312 12:22:46.945223 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"7832f5ffa2c5ed0b1534228e4894dd1d7b32cd0726e9bdedd6ffb73456947fa0"} Mar 12 12:22:46.946072 master-0 kubenswrapper[7320]: I0312 12:22:46.946025 7320 scope.go:117] "RemoveContainer" containerID="7832f5ffa2c5ed0b1534228e4894dd1d7b32cd0726e9bdedd6ffb73456947fa0" Mar 12 12:22:47.953937 master-0 kubenswrapper[7320]: I0312 12:22:47.953890 7320 generic.go:334] "Generic (PLEG): container finished" podID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerID="c6ebc8cdb1ea535bf07be2b08b9bb0f2c20d1bfada4f7f7c593c77044d94f79a" exitCode=0 Mar 12 12:22:47.954460 master-0 kubenswrapper[7320]: I0312 12:22:47.953971 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1","Type":"ContainerDied","Data":"c6ebc8cdb1ea535bf07be2b08b9bb0f2c20d1bfada4f7f7c593c77044d94f79a"} Mar 12 12:22:47.956968 master-0 kubenswrapper[7320]: I0312 12:22:47.956933 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3"} Mar 12 12:22:48.124156 master-0 kubenswrapper[7320]: I0312 12:22:48.124103 7320 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-98xjv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 12 12:22:48.124156 master-0 kubenswrapper[7320]: I0312 12:22:48.124154 7320 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" podUID="a346ac54-02fe-417f-a49d-038e45b13a1d" containerName="authentication-operator" probeResult="failure" output="Get 
\"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 12 12:22:48.184794 master-0 kubenswrapper[7320]: I0312 12:22:48.184719 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 12 12:22:49.273465 master-0 kubenswrapper[7320]: I0312 12:22:49.273365 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 12 12:22:49.416136 master-0 kubenswrapper[7320]: I0312 12:22:49.415848 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kube-api-access\") pod \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " Mar 12 12:22:49.416136 master-0 kubenswrapper[7320]: I0312 12:22:49.415956 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-var-lock\") pod \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " Mar 12 12:22:49.416136 master-0 kubenswrapper[7320]: I0312 12:22:49.416046 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kubelet-dir\") pod \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\" (UID: \"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1\") " Mar 12 12:22:49.416686 master-0 kubenswrapper[7320]: I0312 12:22:49.416154 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-var-lock" (OuterVolumeSpecName: "var-lock") pod "78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" (UID: "78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:22:49.416686 master-0 kubenswrapper[7320]: I0312 12:22:49.416249 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" (UID: "78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:22:49.416952 master-0 kubenswrapper[7320]: I0312 12:22:49.416895 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:49.416952 master-0 kubenswrapper[7320]: I0312 12:22:49.416944 7320 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:49.420626 master-0 kubenswrapper[7320]: I0312 12:22:49.420540 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" (UID: "78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:22:49.518688 master-0 kubenswrapper[7320]: I0312 12:22:49.518550 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:49.535705 master-0 kubenswrapper[7320]: E0312 12:22:49.535380 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:22:49.970851 master-0 kubenswrapper[7320]: I0312 12:22:49.970786 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1","Type":"ContainerDied","Data":"dd601e94b2e497fb8ecd7edce09ab1fd613fe37dc7cd5a8b1f710f61827b3468"}
Mar 12 12:22:49.970851 master-0 kubenswrapper[7320]: I0312 12:22:49.970836 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd601e94b2e497fb8ecd7edce09ab1fd613fe37dc7cd5a8b1f710f61827b3468"
Mar 12 12:22:49.970851 master-0 kubenswrapper[7320]: I0312 12:22:49.970841 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0"
Mar 12 12:22:49.973223 master-0 kubenswrapper[7320]: I0312 12:22:49.973137 7320 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d" exitCode=1
Mar 12 12:22:49.973223 master-0 kubenswrapper[7320]: I0312 12:22:49.973204 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d"}
Mar 12 12:22:49.973755 master-0 kubenswrapper[7320]: I0312 12:22:49.973723 7320 scope.go:117] "RemoveContainer" containerID="f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d"
Mar 12 12:22:50.430605 master-0 kubenswrapper[7320]: E0312 12:22:50.430518 7320 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:22:40Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:22:40Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:22:40Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:22:40Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline exceeded"
Mar 12 12:22:50.980824 master-0 kubenswrapper[7320]: I0312 12:22:50.980743 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"727b627c806117dd5f2141d70aae9b4f04fa57747cadae9611a9e80d6ca1b04b"}
Mar 12 12:22:50.982595 master-0 kubenswrapper[7320]: I0312 12:22:50.982554 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_a59a6bb7-f966-4208-ba85-452095404891/installer/0.log"
Mar 12 12:22:50.982712 master-0 kubenswrapper[7320]: I0312 12:22:50.982600 7320 generic.go:334] "Generic (PLEG): container finished" podID="a59a6bb7-f966-4208-ba85-452095404891" containerID="b0b0a71bb15ee38a2037cc0d67a425037c9a862e431396ce17c0501ae76f6aae" exitCode=1
Mar 12 12:22:50.982712 master-0 kubenswrapper[7320]: I0312 12:22:50.982625 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"a59a6bb7-f966-4208-ba85-452095404891","Type":"ContainerDied","Data":"b0b0a71bb15ee38a2037cc0d67a425037c9a862e431396ce17c0501ae76f6aae"}
Mar 12 12:22:50.982712 master-0 kubenswrapper[7320]: I0312 12:22:50.982644 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"a59a6bb7-f966-4208-ba85-452095404891","Type":"ContainerDied","Data":"d67f03c484ab80e82af34bfbee706a0210f43b50a2db09932840ccaa2c6ed0f8"}
Mar 12 12:22:50.982712 master-0 kubenswrapper[7320]: I0312 12:22:50.982658 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67f03c484ab80e82af34bfbee706a0210f43b50a2db09932840ccaa2c6ed0f8"
Mar 12 12:22:51.024548 master-0 kubenswrapper[7320]: I0312 12:22:51.024497 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_a59a6bb7-f966-4208-ba85-452095404891/installer/0.log"
Mar 12 12:22:51.024548 master-0 kubenswrapper[7320]: I0312 12:22:51.024558 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:51.147200 master-0 kubenswrapper[7320]: I0312 12:22:51.147061 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-var-lock\") pod \"a59a6bb7-f966-4208-ba85-452095404891\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") "
Mar 12 12:22:51.147200 master-0 kubenswrapper[7320]: I0312 12:22:51.147142 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-kubelet-dir\") pod \"a59a6bb7-f966-4208-ba85-452095404891\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") "
Mar 12 12:22:51.147403 master-0 kubenswrapper[7320]: I0312 12:22:51.147243 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-var-lock" (OuterVolumeSpecName: "var-lock") pod "a59a6bb7-f966-4208-ba85-452095404891" (UID: "a59a6bb7-f966-4208-ba85-452095404891"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:22:51.147403 master-0 kubenswrapper[7320]: I0312 12:22:51.147306 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a59a6bb7-f966-4208-ba85-452095404891-kube-api-access\") pod \"a59a6bb7-f966-4208-ba85-452095404891\" (UID: \"a59a6bb7-f966-4208-ba85-452095404891\") "
Mar 12 12:22:51.147403 master-0 kubenswrapper[7320]: I0312 12:22:51.147385 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a59a6bb7-f966-4208-ba85-452095404891" (UID: "a59a6bb7-f966-4208-ba85-452095404891"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:22:51.147647 master-0 kubenswrapper[7320]: I0312 12:22:51.147612 7320 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:51.147703 master-0 kubenswrapper[7320]: I0312 12:22:51.147645 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a59a6bb7-f966-4208-ba85-452095404891-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:51.150648 master-0 kubenswrapper[7320]: I0312 12:22:51.150588 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a59a6bb7-f966-4208-ba85-452095404891-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a59a6bb7-f966-4208-ba85-452095404891" (UID: "a59a6bb7-f966-4208-ba85-452095404891"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:22:51.185857 master-0 kubenswrapper[7320]: I0312 12:22:51.185720 7320 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:22:51.248960 master-0 kubenswrapper[7320]: I0312 12:22:51.248849 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a59a6bb7-f966-4208-ba85-452095404891-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 12 12:22:51.416915 master-0 kubenswrapper[7320]: I0312 12:22:51.416746 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:22:51.988043 master-0 kubenswrapper[7320]: I0312 12:22:51.987978 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 12 12:22:55.020201 master-0 kubenswrapper[7320]: I0312 12:22:55.020113 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-cg7rd_b5890f0c-cebe-4788-89f7-27568d875741/openshift-controller-manager-operator/0.log"
Mar 12 12:22:55.020201 master-0 kubenswrapper[7320]: I0312 12:22:55.020194 7320 generic.go:334] "Generic (PLEG): container finished" podID="b5890f0c-cebe-4788-89f7-27568d875741" containerID="54c0c581483d3deef8c82f62f09c7eb9259f3f30693873246ec5154f0dcb5178" exitCode=1
Mar 12 12:22:55.021375 master-0 kubenswrapper[7320]: I0312 12:22:55.020238 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" event={"ID":"b5890f0c-cebe-4788-89f7-27568d875741","Type":"ContainerDied","Data":"54c0c581483d3deef8c82f62f09c7eb9259f3f30693873246ec5154f0dcb5178"}
Mar 12 12:22:55.021375 master-0 kubenswrapper[7320]: I0312 12:22:55.020902 7320 scope.go:117] "RemoveContainer" containerID="54c0c581483d3deef8c82f62f09c7eb9259f3f30693873246ec5154f0dcb5178"
Mar 12 12:22:56.028085 master-0 kubenswrapper[7320]: I0312 12:22:56.028003 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-cg7rd_b5890f0c-cebe-4788-89f7-27568d875741/openshift-controller-manager-operator/0.log"
Mar 12 12:22:56.028085 master-0 kubenswrapper[7320]: I0312 12:22:56.028072 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" event={"ID":"b5890f0c-cebe-4788-89f7-27568d875741","Type":"ContainerStarted","Data":"cb0a782fc63b924a99ff6d328a51e27d6d8de3607d733d3c0dc029569e79ee11"}
Mar 12 12:22:58.124858 master-0 kubenswrapper[7320]: I0312 12:22:58.124777 7320 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-98xjv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 12 12:22:58.124858 master-0 kubenswrapper[7320]: I0312 12:22:58.124838 7320 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" podUID="a346ac54-02fe-417f-a49d-038e45b13a1d" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 12 12:22:59.536414 master-0 kubenswrapper[7320]: E0312 12:22:59.536334 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:22:59.943367 master-0 kubenswrapper[7320]: E0312 12:22:59.943281 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 12 12:23:00.049239 master-0 kubenswrapper[7320]: I0312 12:23:00.049178 7320 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714" exitCode=0
Mar 12 12:23:00.431081 master-0 kubenswrapper[7320]: E0312 12:23:00.430857 7320 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:01.060154 master-0 kubenswrapper[7320]: I0312 12:23:01.060071 7320 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="17df0049e355b3a960768281cd9fb4fe90537eac08f31c82188b349d802deef8" exitCode=0
Mar 12 12:23:01.060154 master-0 kubenswrapper[7320]: I0312 12:23:01.060137 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"17df0049e355b3a960768281cd9fb4fe90537eac08f31c82188b349d802deef8"}
Mar 12 12:23:01.184716 master-0 kubenswrapper[7320]: I0312 12:23:01.184612 7320 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:02.961944 master-0 kubenswrapper[7320]: I0312 12:23:02.961875 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log"
Mar 12 12:23:02.962647 master-0 kubenswrapper[7320]: I0312 12:23:02.962011 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:23:03.073536 master-0 kubenswrapper[7320]: I0312 12:23:03.073435 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log"
Mar 12 12:23:03.073536 master-0 kubenswrapper[7320]: I0312 12:23:03.073521 7320 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6" exitCode=137
Mar 12 12:23:03.073867 master-0 kubenswrapper[7320]: I0312 12:23:03.073576 7320 scope.go:117] "RemoveContainer" containerID="75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714"
Mar 12 12:23:03.073867 master-0 kubenswrapper[7320]: I0312 12:23:03.073721 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:23:03.088109 master-0 kubenswrapper[7320]: I0312 12:23:03.088063 7320 scope.go:117] "RemoveContainer" containerID="e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6"
Mar 12 12:23:03.106898 master-0 kubenswrapper[7320]: I0312 12:23:03.106832 7320 scope.go:117] "RemoveContainer" containerID="75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714"
Mar 12 12:23:03.107459 master-0 kubenswrapper[7320]: E0312 12:23:03.107412 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714\": container with ID starting with 75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714 not found: ID does not exist" containerID="75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714"
Mar 12 12:23:03.107666 master-0 kubenswrapper[7320]: I0312 12:23:03.107469 7320 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714"} err="failed to get container status \"75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714\": rpc error: code = NotFound desc = could not find container \"75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714\": container with ID starting with 75daf3045717838ac209cd480f9a2d037d9ba8f6a947c428ef3b5b5ac58ef714 not found: ID does not exist"
Mar 12 12:23:03.107666 master-0 kubenswrapper[7320]: I0312 12:23:03.107520 7320 scope.go:117] "RemoveContainer" containerID="e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6"
Mar 12 12:23:03.108117 master-0 kubenswrapper[7320]: E0312 12:23:03.108062 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6\": container with ID starting with e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6 not found: ID does not exist" containerID="e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6"
Mar 12 12:23:03.108359 master-0 kubenswrapper[7320]: I0312 12:23:03.108297 7320 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6"} err="failed to get container status \"e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6\": rpc error: code = NotFound desc = could not find container \"e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6\": container with ID starting with e45349fe3702f21a7fe88bad4e7dbd9bb8a933bb935302952c53ce763b6a48d6 not found: ID does not exist"
Mar 12 12:23:03.111832 master-0 kubenswrapper[7320]: I0312 12:23:03.111782 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") "
Mar 12 12:23:03.111950 master-0 kubenswrapper[7320]: I0312 12:23:03.111884 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") "
Mar 12 12:23:03.112069 master-0 kubenswrapper[7320]: I0312 12:23:03.112015 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir" (OuterVolumeSpecName: "data-dir") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:23:03.112167 master-0 kubenswrapper[7320]: I0312 12:23:03.112082 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs" (OuterVolumeSpecName: "certs") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:23:03.112780 master-0 kubenswrapper[7320]: I0312 12:23:03.112710 7320 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:23:03.112780 master-0 kubenswrapper[7320]: I0312 12:23:03.112769 7320 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") on node \"master-0\" DevicePath \"\""
Mar 12 12:23:03.760395 master-0 kubenswrapper[7320]: I0312 12:23:03.760338 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f29997baa583b6238f7de9108ee10" path="/var/lib/kubelet/pods/354f29997baa583b6238f7de9108ee10/volumes"
Mar 12 12:23:03.760691 master-0 kubenswrapper[7320]: I0312 12:23:03.760661 7320 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 12 12:23:06.859683 master-0 kubenswrapper[7320]: E0312 12:23:06.859336 7320 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c1776ed2a4674 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:22:32.829683316 +0000 UTC m=+75.388727197,LastTimestamp:2026-03-12 12:22:32.829683316 +0000 UTC m=+75.388727197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:23:08.124448 master-0 kubenswrapper[7320]: I0312 12:23:08.124332 7320 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-98xjv container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 12 12:23:08.124448 master-0 kubenswrapper[7320]: I0312 12:23:08.124432 7320 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" podUID="a346ac54-02fe-417f-a49d-038e45b13a1d" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 12 12:23:09.537686 master-0 kubenswrapper[7320]: E0312 12:23:09.537605 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:10.432018 master-0 kubenswrapper[7320]: E0312 12:23:10.431931 7320 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)"
Mar 12 12:23:11.126263 master-0 kubenswrapper[7320]: I0312 12:23:11.126199 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-rbb5m_61ab511b-72e9-4fb9-b5de-770f49514369/network-operator/0.log"
Mar 12 12:23:11.126874 master-0 kubenswrapper[7320]: I0312 12:23:11.126268 7320 generic.go:334] "Generic (PLEG): container finished" podID="61ab511b-72e9-4fb9-b5de-770f49514369" containerID="630c088f3826f86c1fe389a213d79d0dfdd3c10669dd76b8de7210253f979c04" exitCode=255
Mar 12 12:23:11.185138 master-0 kubenswrapper[7320]: I0312 12:23:11.185052 7320 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:13.143089 master-0 kubenswrapper[7320]: I0312 12:23:13.142926 7320 generic.go:334] "Generic (PLEG): container finished" podID="8a121d0d-d201-446b-97a1-e2414e599f4a" containerID="13d86bfb78c4cbeb08e1a2822a58d3b19158c64d8933a782a2465848cf9de135" exitCode=0
Mar 12 12:23:14.069365 master-0 kubenswrapper[7320]: E0312 12:23:14.069266 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 12 12:23:14.154285 master-0 kubenswrapper[7320]: I0312 12:23:14.154212 7320 generic.go:334] "Generic (PLEG): container finished" podID="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" containerID="8ee43439e03e174fce129de95caef5dbf8392a0dcca1c8da1e1088570ad3efed" exitCode=0
Mar 12 12:23:15.162114 master-0 kubenswrapper[7320]: I0312 12:23:15.162045 7320 generic.go:334] "Generic (PLEG): container finished" podID="55bf535c-93ab-4870-a9d2-c02496d71ef0" containerID="6a3b8be971ca63800f0532603d6fc3d806dc294fa7565d4894a90520eb420540" exitCode=0
Mar 12 12:23:15.164170 master-0 kubenswrapper[7320]: I0312 12:23:15.164112 7320 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="d7e05b92a4dbd9fdd6089f7478db7952000d8da47fb29fa9de9acabcf994c90c" exitCode=0
Mar 12 12:23:19.188156 master-0 kubenswrapper[7320]: I0312 12:23:19.187953 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-rzmhl_51d58450-50bb-4da0-b1f6-4135fbabd856/approver/0.log"
Mar 12 12:23:19.188924 master-0 kubenswrapper[7320]: I0312 12:23:19.188563 7320 generic.go:334] "Generic (PLEG): container finished" podID="51d58450-50bb-4da0-b1f6-4135fbabd856" containerID="2c2b5cd50e4b41a7c3aafd02e56e622ce6b2150721ba8e3603b831c988a04475" exitCode=1
Mar 12 12:23:19.191364 master-0 kubenswrapper[7320]: I0312 12:23:19.191298 7320 generic.go:334] "Generic (PLEG): container finished" podID="a346ac54-02fe-417f-a49d-038e45b13a1d" containerID="ebdafa22bf6b7ed28319fcbd34230d3e124233b075083adefec56e18e5a788b3" exitCode=0
Mar 12 12:23:19.539935 master-0 kubenswrapper[7320]: E0312 12:23:19.539792 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:20.434280 master-0 kubenswrapper[7320]: E0312 12:23:20.434017 7320 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:21.206784 master-0 kubenswrapper[7320]: I0312 12:23:21.206699 7320 generic.go:334] "Generic (PLEG): container finished" podID="ea80247e-b4dd-45dc-8255-6e68508c8480" containerID="fc4b4e674883cb3e17ae8f6229f477c4fc095a1d76196bd6eee19ed1d8bb25c9" exitCode=0
Mar 12 12:23:28.172948 master-0 kubenswrapper[7320]: E0312 12:23:28.172867 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 12 12:23:29.254023 master-0 kubenswrapper[7320]: I0312 12:23:29.253931 7320 generic.go:334] "Generic (PLEG): container finished" podID="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" containerID="2ecd7f48de11aae6e5506fd79aa229be8956b481497e7ee996afbf26849c14c9" exitCode=0
Mar 12 12:23:29.541088 master-0 kubenswrapper[7320]: E0312 12:23:29.540964 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:29.541088 master-0 kubenswrapper[7320]: I0312 12:23:29.541082 7320 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 12 12:23:30.277360 master-0 kubenswrapper[7320]: I0312 12:23:30.277274 7320 generic.go:334] "Generic (PLEG): container finished" podID="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" containerID="78cb426b98a54442332ae7dea069dbb75e6d07a8377b812d61f3d58bf1a33d17" exitCode=0
Mar 12 12:23:30.434529 master-0 kubenswrapper[7320]: E0312 12:23:30.434328 7320 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:23:30.434837 master-0 kubenswrapper[7320]: E0312 12:23:30.434803 7320 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 12 12:23:33.844665 master-0 kubenswrapper[7320]: I0312 12:23:33.844580 7320 status_manager.go:851] "Failed to get status for pod" podUID="3c02552c-a477-4c6c-8a45-2fdc758c084b" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods marketplace-operator-64bf9778cb-rgstx)"
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: E0312 12:23:34.254075 7320 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_a97fcd56-aa52-414a-b370-154c1b34c1ed_0(f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6" Netns:"/var/run/netns/203870f4-4292-4e1c-ab38-1e93ebab012d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6;K8S_POD_UID=a97fcd56-aa52-414a-b370-154c1b34c1ed" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/a97fcd56-aa52-414a-b370-154c1b34c1ed]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: >
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: E0312 12:23:34.254172 7320 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_a97fcd56-aa52-414a-b370-154c1b34c1ed_0(f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6" Netns:"/var/run/netns/203870f4-4292-4e1c-ab38-1e93ebab012d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6;K8S_POD_UID=a97fcd56-aa52-414a-b370-154c1b34c1ed" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/a97fcd56-aa52-414a-b370-154c1b34c1ed]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: > pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: E0312 12:23:34.254197 7320 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_a97fcd56-aa52-414a-b370-154c1b34c1ed_0(f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6" Netns:"/var/run/netns/203870f4-4292-4e1c-ab38-1e93ebab012d" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6;K8S_POD_UID=a97fcd56-aa52-414a-b370-154c1b34c1ed" Path:"" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/a97fcd56-aa52-414a-b370-154c1b34c1ed]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: > pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:23:34.254585 master-0 kubenswrapper[7320]: E0312 12:23:34.254278 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-2-master-0_openshift-kube-controller-manager(a97fcd56-aa52-414a-b370-154c1b34c1ed)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-2-master-0_openshift-kube-controller-manager(a97fcd56-aa52-414a-b370-154c1b34c1ed)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-2-master-0_openshift-kube-controller-manager_a97fcd56-aa52-414a-b370-154c1b34c1ed_0(f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6): error adding pod openshift-kube-controller-manager_installer-2-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI 
request failed with status 400: 'ContainerID:\\\"f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6\\\" Netns:\\\"/var/run/netns/203870f4-4292-4e1c-ab38-1e93ebab012d\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-controller-manager;K8S_POD_NAME=installer-2-master-0;K8S_POD_INFRA_CONTAINER_ID=f98838cd35e33a5ced62304a70b0b8d89126e2b036c6b91a1069d0a93f750bc6;K8S_POD_UID=a97fcd56-aa52-414a-b370-154c1b34c1ed\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-controller-manager/installer-2-master-0] networking: Multus: [openshift-kube-controller-manager/installer-2-master-0/a97fcd56-aa52-414a-b370-154c1b34c1ed]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-2-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-2-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-controller-manager/installer-2-master-0" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: E0312 12:23:34.263707 7320 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create 
pod network sandbox k8s_installer-1-master-0_openshift-kube-apiserver_48e7be9a-921a-42b0-b9ae-b7ffd28c89a4_0(b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261): error adding pod openshift-kube-apiserver_installer-1-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261" Netns:"/var/run/netns/7329d9ee-f49a-4a8c-adc3-5305bd18edc6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-apiserver;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261;K8S_POD_UID=48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" Path:"" ERRORED: error configuring pod [openshift-kube-apiserver/installer-1-master-0] networking: Multus: [openshift-kube-apiserver/installer-1-master-0/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: >
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: E0312 12:23:34.263752 7320 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-1-master-0_openshift-kube-apiserver_48e7be9a-921a-42b0-b9ae-b7ffd28c89a4_0(b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261): error adding pod openshift-kube-apiserver_installer-1-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261" Netns:"/var/run/netns/7329d9ee-f49a-4a8c-adc3-5305bd18edc6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-apiserver;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261;K8S_POD_UID=48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" Path:"" ERRORED: error configuring pod [openshift-kube-apiserver/installer-1-master-0] networking: Multus: [openshift-kube-apiserver/installer-1-master-0/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: > pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: E0312 12:23:34.263772 7320 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-1-master-0_openshift-kube-apiserver_48e7be9a-921a-42b0-b9ae-b7ffd28c89a4_0(b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261): error adding pod openshift-kube-apiserver_installer-1-master-0 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261" Netns:"/var/run/netns/7329d9ee-f49a-4a8c-adc3-5305bd18edc6" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-apiserver;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261;K8S_POD_UID=48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" Path:"" ERRORED: error configuring pod [openshift-kube-apiserver/installer-1-master-0] networking: Multus: [openshift-kube-apiserver/installer-1-master-0/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: > pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:23:34.263874 master-0 kubenswrapper[7320]: E0312 12:23:34.263817 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"installer-1-master-0_openshift-kube-apiserver(48e7be9a-921a-42b0-b9ae-b7ffd28c89a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"installer-1-master-0_openshift-kube-apiserver(48e7be9a-921a-42b0-b9ae-b7ffd28c89a4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_installer-1-master-0_openshift-kube-apiserver_48e7be9a-921a-42b0-b9ae-b7ffd28c89a4_0(b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261): error adding pod openshift-kube-apiserver_installer-1-master-0 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261\\\" Netns:\\\"/var/run/netns/7329d9ee-f49a-4a8c-adc3-5305bd18edc6\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-kube-apiserver;K8S_POD_NAME=installer-1-master-0;K8S_POD_INFRA_CONTAINER_ID=b159df217923561cdbd535f8fac74c2c4284ba8f86e433abb485a34e9e393261;K8S_POD_UID=48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-kube-apiserver/installer-1-master-0] networking: Multus: [openshift-kube-apiserver/installer-1-master-0/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod installer-1-master-0 in out of cluster comm: SetNetworkStatus: failed to update the pod installer-1-master-0 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-kube-apiserver/installer-1-master-0" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4"
Mar 12 12:23:34.273885 master-0 kubenswrapper[7320]: E0312 12:23:34.273787 7320 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 12 12:23:34.273885 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dnxx4_openshift-machine-api_c62edaec-38e2-4b73-8bb5-c776abfb310f_0(cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03): error adding pod
openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03" Netns:"/var/run/netns/1babf96d-7c0d-46b5-81bc-516e225867ed" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dnxx4;K8S_POD_INFRA_CONTAINER_ID=cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03;K8S_POD_UID=c62edaec-38e2-4b73-8bb5-c776abfb310f" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4/c62edaec-38e2-4b73-8bb5-c776abfb310f]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dnxx4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.273885 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.273885 master-0 kubenswrapper[7320]: >
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: E0312 12:23:34.273941 7320 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=<
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dnxx4_openshift-machine-api_c62edaec-38e2-4b73-8bb5-c776abfb310f_0(cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03): error adding pod openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03" Netns:"/var/run/netns/1babf96d-7c0d-46b5-81bc-516e225867ed" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dnxx4;K8S_POD_INFRA_CONTAINER_ID=cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03;K8S_POD_UID=c62edaec-38e2-4b73-8bb5-c776abfb310f" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4/c62edaec-38e2-4b73-8bb5-c776abfb310f]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dnxx4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: > pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: E0312 12:23:34.273966 7320 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=<
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dnxx4_openshift-machine-api_c62edaec-38e2-4b73-8bb5-c776abfb310f_0(cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03): error adding pod openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4 to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03" Netns:"/var/run/netns/1babf96d-7c0d-46b5-81bc-516e225867ed" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dnxx4;K8S_POD_INFRA_CONTAINER_ID=cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03;K8S_POD_UID=c62edaec-38e2-4b73-8bb5-c776abfb310f" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4/c62edaec-38e2-4b73-8bb5-c776abfb310f]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dnxx4?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers)
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"}
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: > pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"
Mar 12 12:23:34.274160 master-0 kubenswrapper[7320]: E0312 12:23:34.274044 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"control-plane-machine-set-operator-6686554ddc-dnxx4_openshift-machine-api(c62edaec-38e2-4b73-8bb5-c776abfb310f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"control-plane-machine-set-operator-6686554ddc-dnxx4_openshift-machine-api(c62edaec-38e2-4b73-8bb5-c776abfb310f)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dnxx4_openshift-machine-api_c62edaec-38e2-4b73-8bb5-c776abfb310f_0(cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03): error adding pod openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4 to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03\\\" Netns:\\\"/var/run/netns/1babf96d-7c0d-46b5-81bc-516e225867ed\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dnxx4;K8S_POD_INFRA_CONTAINER_ID=cd4a8ef0411dc82e0652ce459c5fad10885d5eed8c3c10fe9da7a0db7d2deb03;K8S_POD_UID=c62edaec-38e2-4b73-8bb5-c776abfb310f\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4/c62edaec-38e2-4b73-8bb5-c776abfb310f]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6686554ddc-dnxx4 in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dnxx4?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" podUID="c62edaec-38e2-4b73-8bb5-c776abfb310f"
Mar 12 12:23:34.304238 master-0 kubenswrapper[7320]: I0312 12:23:34.303330 7320 log.go:25] "Finished
parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-sp7w9_9bc7dea3-1868-488c-a34b-288cde3acd35/olm-operator/0.log"
Mar 12 12:23:34.304238 master-0 kubenswrapper[7320]: I0312 12:23:34.303392 7320 generic.go:334] "Generic (PLEG): container finished" podID="9bc7dea3-1868-488c-a34b-288cde3acd35" containerID="66ac5cb5ac01ba22dc5debb8648c5e910f8e7732cb1f9e6097ebc2965cc2ccbc" exitCode=1
Mar 12 12:23:34.306283 master-0 kubenswrapper[7320]: I0312 12:23:34.306231 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-nwk7v_d961a5f0-84b7-47d7-846b-238475947121/catalog-operator/0.log"
Mar 12 12:23:34.306348 master-0 kubenswrapper[7320]: I0312 12:23:34.306317 7320 generic.go:334] "Generic (PLEG): container finished" podID="d961a5f0-84b7-47d7-846b-238475947121" containerID="500e92ed2e084af2e7e87754fde25e08ea680d95e255a969f1e5b249d9b80765" exitCode=1
Mar 12 12:23:34.306572 master-0 kubenswrapper[7320]: I0312 12:23:34.306427 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"
Mar 12 12:23:34.306572 master-0 kubenswrapper[7320]: I0312 12:23:34.306457 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:23:34.306572 master-0 kubenswrapper[7320]: I0312 12:23:34.306520 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:23:34.307385 master-0 kubenswrapper[7320]: I0312 12:23:34.306975 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:23:34.307385 master-0 kubenswrapper[7320]: I0312 12:23:34.307015 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"
Mar 12 12:23:34.307385 master-0 kubenswrapper[7320]: I0312 12:23:34.307111 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:23:37.763893 master-0 kubenswrapper[7320]: E0312 12:23:37.763833 7320 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 12 12:23:37.764346 master-0 kubenswrapper[7320]: E0312 12:23:37.764130 7320 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.01s"
Mar 12 12:23:37.784079 master-0 kubenswrapper[7320]: I0312 12:23:37.783986 7320 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 12 12:23:39.542294 master-0 kubenswrapper[7320]: E0312 12:23:39.542171 7320 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms"
Mar 12 12:23:40.570954 master-0 kubenswrapper[7320]: E0312 12:23:40.570903 7320 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.807s"
Mar 12 12:23:40.572286 master-0 kubenswrapper[7320]: I0312 12:23:40.571035 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:23:40.572286 master-0 kubenswrapper[7320]: I0312 12:23:40.571052 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:23:40.572286 master-0 kubenswrapper[7320]: I0312 12:23:40.571061 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" event={"ID":"61ab511b-72e9-4fb9-b5de-770f49514369","Type":"ContainerDied","Data":"630c088f3826f86c1fe389a213d79d0dfdd3c10669dd76b8de7210253f979c04"}
Mar 12 12:23:40.572286 master-0 kubenswrapper[7320]: I0312 12:23:40.571082 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" event={"ID":"8a121d0d-d201-446b-97a1-e2414e599f4a","Type":"ContainerDied","Data":"13d86bfb78c4cbeb08e1a2822a58d3b19158c64d8933a782a2465848cf9de135"}
Mar 12 12:23:40.573203 master-0 kubenswrapper[7320]: I0312 12:23:40.573148 7320 scope.go:117] "RemoveContainer" containerID="13d86bfb78c4cbeb08e1a2822a58d3b19158c64d8933a782a2465848cf9de135"
Mar 12 12:23:40.575275 master-0 kubenswrapper[7320]: I0312 12:23:40.574271 7320 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 12 12:23:40.575275 master-0 kubenswrapper[7320]: I0312 12:23:40.574338 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3" gracePeriod=30
Mar 12 12:23:40.575275 master-0 kubenswrapper[7320]: I0312 12:23:40.574852 7320 scope.go:117] "RemoveContainer" containerID="630c088f3826f86c1fe389a213d79d0dfdd3c10669dd76b8de7210253f979c04"
Mar 12 12:23:40.580244 master-0 kubenswrapper[7320]: I0312 12:23:40.580167 7320
mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID=""
Mar 12 12:23:40.596437 master-0 kubenswrapper[7320]: I0312 12:23:40.595739 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" event={"ID":"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326","Type":"ContainerDied","Data":"8ee43439e03e174fce129de95caef5dbf8392a0dcca1c8da1e1088570ad3efed"}
Mar 12 12:23:40.596437 master-0 kubenswrapper[7320]: I0312 12:23:40.595806 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" event={"ID":"55bf535c-93ab-4870-a9d2-c02496d71ef0","Type":"ContainerDied","Data":"6a3b8be971ca63800f0532603d6fc3d806dc294fa7565d4894a90520eb420540"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596684 7320 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596730 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"d7e05b92a4dbd9fdd6089f7478db7952000d8da47fb29fa9de9acabcf994c90c"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596765 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-rzmhl" event={"ID":"51d58450-50bb-4da0-b1f6-4135fbabd856","Type":"ContainerDied","Data":"2c2b5cd50e4b41a7c3aafd02e56e622ce6b2150721ba8e3603b831c988a04475"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596786 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" event={"ID":"a346ac54-02fe-417f-a49d-038e45b13a1d","Type":"ContainerDied","Data":"ebdafa22bf6b7ed28319fcbd34230d3e124233b075083adefec56e18e5a788b3"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596809 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" event={"ID":"ea80247e-b4dd-45dc-8255-6e68508c8480","Type":"ContainerDied","Data":"fc4b4e674883cb3e17ae8f6229f477c4fc095a1d76196bd6eee19ed1d8bb25c9"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596831 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" event={"ID":"9b960fe2-d59e-4ee1-bd9d-455b46753cb9","Type":"ContainerDied","Data":"2ecd7f48de11aae6e5506fd79aa229be8956b481497e7ee996afbf26849c14c9"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596853 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"49b7eed659093dd9559e33b3a0f638adb7dbba47c5e4ef561aa467ffc52b0eec"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596875 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"f7059097bb2ddbff3e211fcd38dee654f074ce10a4773944219fb7905e3d5723"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596892 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"b2f5073adcb260f7bafd0dbb4eae76a0d78ce200196488dbbf37a087b47f06a5"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596908 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"b0a55edaf166c34b98b5dcf71563b10a7151eb2e9e60290d0d641d4546feecf7"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596924 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"75e3bbf6064ff7f35e82389d300ce882963ffc8541436a1a80d239d3b971f5e4"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596941 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" event={"ID":"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee","Type":"ContainerDied","Data":"78cb426b98a54442332ae7dea069dbb75e6d07a8377b812d61f3d58bf1a33d17"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596965 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" event={"ID":"9bc7dea3-1868-488c-a34b-288cde3acd35","Type":"ContainerDied","Data":"66ac5cb5ac01ba22dc5debb8648c5e910f8e7732cb1f9e6097ebc2965cc2ccbc"}
Mar 12 12:23:40.597296 master-0 kubenswrapper[7320]: I0312 12:23:40.596986 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" event={"ID":"d961a5f0-84b7-47d7-846b-238475947121","Type":"ContainerDied","Data":"500e92ed2e084af2e7e87754fde25e08ea680d95e255a969f1e5b249d9b80765"}
Mar 12 12:23:40.599429 master-0 kubenswrapper[7320]: I0312 12:23:40.598208 7320 scope.go:117] "RemoveContainer" containerID="8ee43439e03e174fce129de95caef5dbf8392a0dcca1c8da1e1088570ad3efed"
Mar 12 12:23:40.599429 master-0 kubenswrapper[7320]: I0312 12:23:40.598871 7320 scope.go:117] "RemoveContainer" containerID="2c2b5cd50e4b41a7c3aafd02e56e622ce6b2150721ba8e3603b831c988a04475"
Mar 12 12:23:40.599429 master-0 kubenswrapper[7320]: I0312
12:23:40.599022 7320 scope.go:117] "RemoveContainer" containerID="ebdafa22bf6b7ed28319fcbd34230d3e124233b075083adefec56e18e5a788b3" Mar 12 12:23:40.599429 master-0 kubenswrapper[7320]: I0312 12:23:40.599237 7320 scope.go:117] "RemoveContainer" containerID="2ecd7f48de11aae6e5506fd79aa229be8956b481497e7ee996afbf26849c14c9" Mar 12 12:23:40.599827 master-0 kubenswrapper[7320]: I0312 12:23:40.599696 7320 scope.go:117] "RemoveContainer" containerID="78cb426b98a54442332ae7dea069dbb75e6d07a8377b812d61f3d58bf1a33d17" Mar 12 12:23:40.600114 master-0 kubenswrapper[7320]: I0312 12:23:40.599850 7320 scope.go:117] "RemoveContainer" containerID="6a3b8be971ca63800f0532603d6fc3d806dc294fa7565d4894a90520eb420540" Mar 12 12:23:40.600307 master-0 kubenswrapper[7320]: I0312 12:23:40.600123 7320 scope.go:117] "RemoveContainer" containerID="66ac5cb5ac01ba22dc5debb8648c5e910f8e7732cb1f9e6097ebc2965cc2ccbc" Mar 12 12:23:40.600695 master-0 kubenswrapper[7320]: I0312 12:23:40.600507 7320 scope.go:117] "RemoveContainer" containerID="fc4b4e674883cb3e17ae8f6229f477c4fc095a1d76196bd6eee19ed1d8bb25c9" Mar 12 12:23:40.609568 master-0 kubenswrapper[7320]: I0312 12:23:40.601362 7320 scope.go:117] "RemoveContainer" containerID="500e92ed2e084af2e7e87754fde25e08ea680d95e255a969f1e5b249d9b80765" Mar 12 12:23:40.650288 master-0 kubenswrapper[7320]: I0312 12:23:40.650222 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 12 12:23:40.650288 master-0 kubenswrapper[7320]: I0312 12:23:40.650264 7320 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="0e0cdd06-8e29-4ad7-99bb-00c76bab51cc" Mar 12 12:23:40.672400 master-0 kubenswrapper[7320]: I0312 12:23:40.662517 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 12 12:23:40.672400 master-0 kubenswrapper[7320]: I0312 12:23:40.662575 7320 kubelet.go:2673] "Unable to find pod for 
mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="0e0cdd06-8e29-4ad7-99bb-00c76bab51cc" Mar 12 12:23:40.676057 master-0 kubenswrapper[7320]: I0312 12:23:40.676004 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"] Mar 12 12:23:40.680868 master-0 kubenswrapper[7320]: I0312 12:23:40.680067 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 12 12:23:40.684590 master-0 kubenswrapper[7320]: I0312 12:23:40.682931 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 12 12:23:40.792910 master-0 kubenswrapper[7320]: I0312 12:23:40.792881 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 12 12:23:40.794526 master-0 kubenswrapper[7320]: I0312 12:23:40.794453 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 12 12:23:40.840911 master-0 kubenswrapper[7320]: I0312 12:23:40.839443 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 12 12:23:40.842976 master-0 kubenswrapper[7320]: I0312 12:23:40.842874 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 12 12:23:40.880853 master-0 kubenswrapper[7320]: I0312 12:23:40.880722 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 12 12:23:41.355180 master-0 kubenswrapper[7320]: I0312 12:23:41.355048 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" 
event={"ID":"c62edaec-38e2-4b73-8bb5-c776abfb310f","Type":"ContainerStarted","Data":"1faac8f245695a0b4273e53303a303c00cf33c7f391f25e9c09fc9c6b457b1b5"} Mar 12 12:23:41.357018 master-0 kubenswrapper[7320]: I0312 12:23:41.356979 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" event={"ID":"9b960fe2-d59e-4ee1-bd9d-455b46753cb9","Type":"ContainerStarted","Data":"adbce10bbd03706ca8976a91a838c5ca64faa97ab5841f447026b6066d7a6ac9"} Mar 12 12:23:41.359509 master-0 kubenswrapper[7320]: I0312 12:23:41.359457 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-sp7w9_9bc7dea3-1868-488c-a34b-288cde3acd35/olm-operator/0.log" Mar 12 12:23:41.359577 master-0 kubenswrapper[7320]: I0312 12:23:41.359525 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" event={"ID":"9bc7dea3-1868-488c-a34b-288cde3acd35","Type":"ContainerStarted","Data":"26eaf72d4bd1845bad1de114fe208fd2d17ff674448efed958c133545de99e55"} Mar 12 12:23:41.360173 master-0 kubenswrapper[7320]: I0312 12:23:41.360141 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:23:41.370976 master-0 kubenswrapper[7320]: I0312 12:23:41.370816 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:23:41.378251 master-0 kubenswrapper[7320]: I0312 12:23:41.378198 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" event={"ID":"a346ac54-02fe-417f-a49d-038e45b13a1d","Type":"ContainerStarted","Data":"941fa559d9d7b66bd5a072194e43c573035c3b0b8e08bb96d47d582c0ecc6967"} Mar 12 12:23:41.401205 master-0 kubenswrapper[7320]: I0312 
12:23:41.401149 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" event={"ID":"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326","Type":"ContainerStarted","Data":"c7c86fd98ced412475f1149d0cc017a50f778e2d78c004abf35c9f6bce70ca8d"} Mar 12 12:23:41.403653 master-0 kubenswrapper[7320]: I0312 12:23:41.403614 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4","Type":"ContainerStarted","Data":"85782e605fa997f78fa1b00da3fcf4b854ebddeada29c56b1d38c44587c26563"} Mar 12 12:23:41.403653 master-0 kubenswrapper[7320]: I0312 12:23:41.403651 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4","Type":"ContainerStarted","Data":"d5a035c7049c39dfacabd51a96c5380843e394f28f3ad6d0e9bceda8fa427e90"} Mar 12 12:23:41.422700 master-0 kubenswrapper[7320]: I0312 12:23:41.422629 7320 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3" exitCode=2 Mar 12 12:23:41.422896 master-0 kubenswrapper[7320]: I0312 12:23:41.422765 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3"} Mar 12 12:23:41.422896 master-0 kubenswrapper[7320]: I0312 12:23:41.422805 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"a7884f3a85cd7c647eb2f8e86a22aa409d3d4d1224200c8d3e0d66ec06385f1c"} Mar 12 12:23:41.422896 master-0 kubenswrapper[7320]: I0312 12:23:41.422826 7320 scope.go:117] 
"RemoveContainer" containerID="7832f5ffa2c5ed0b1534228e4894dd1d7b32cd0726e9bdedd6ffb73456947fa0" Mar 12 12:23:41.432700 master-0 kubenswrapper[7320]: I0312 12:23:41.432671 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-nwk7v_d961a5f0-84b7-47d7-846b-238475947121/catalog-operator/0.log" Mar 12 12:23:41.432771 master-0 kubenswrapper[7320]: I0312 12:23:41.432745 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" event={"ID":"d961a5f0-84b7-47d7-846b-238475947121","Type":"ContainerStarted","Data":"62fba9ff5715bd51228717637c330e384d6b327f9410e3b1039aa5cc8ed7e923"} Mar 12 12:23:41.433657 master-0 kubenswrapper[7320]: I0312 12:23:41.433640 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:23:41.436555 master-0 kubenswrapper[7320]: I0312 12:23:41.436534 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-rbb5m_61ab511b-72e9-4fb9-b5de-770f49514369/network-operator/0.log" Mar 12 12:23:41.436625 master-0 kubenswrapper[7320]: I0312 12:23:41.436588 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" event={"ID":"61ab511b-72e9-4fb9-b5de-770f49514369","Type":"ContainerStarted","Data":"8741f0c5d9ae8c420d7e1565f7e1be687c422aa5e8cfa1251daef35ba81f0658"} Mar 12 12:23:41.442714 master-0 kubenswrapper[7320]: I0312 12:23:41.442697 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:23:41.442926 master-0 kubenswrapper[7320]: I0312 12:23:41.442909 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" 
event={"ID":"ea80247e-b4dd-45dc-8255-6e68508c8480","Type":"ContainerStarted","Data":"4a0a9175b7c0c59002eae473cc3ee48496f910f6cbb416a6394f728e07c36f5b"} Mar 12 12:23:41.444324 master-0 kubenswrapper[7320]: I0312 12:23:41.444278 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"a97fcd56-aa52-414a-b370-154c1b34c1ed","Type":"ContainerStarted","Data":"9e58c221e9a2e73d89eb52ff2e2377c97caf0ea7574d33f3dc1598a292639881"} Mar 12 12:23:41.444387 master-0 kubenswrapper[7320]: I0312 12:23:41.444333 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"a97fcd56-aa52-414a-b370-154c1b34c1ed","Type":"ContainerStarted","Data":"3cb6ab6d4b334e440fd45f035b87035c3f267128290d2900f9474bca9932ecf6"} Mar 12 12:23:41.448800 master-0 kubenswrapper[7320]: I0312 12:23:41.448778 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" event={"ID":"8a121d0d-d201-446b-97a1-e2414e599f4a","Type":"ContainerStarted","Data":"690cd59d2d0bb74dcb355914f6005a55170daed8e8752f6aea025d6658f4c87b"} Mar 12 12:23:41.451140 master-0 kubenswrapper[7320]: I0312 12:23:41.451097 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-rzmhl_51d58450-50bb-4da0-b1f6-4135fbabd856/approver/0.log" Mar 12 12:23:41.451461 master-0 kubenswrapper[7320]: I0312 12:23:41.451422 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-rzmhl" event={"ID":"51d58450-50bb-4da0-b1f6-4135fbabd856","Type":"ContainerStarted","Data":"9ebb624a01176d8a7e4322c3422f03b83091e3497ad89b9a86f0622ff33645b0"} Mar 12 12:23:41.454602 master-0 kubenswrapper[7320]: I0312 12:23:41.454572 7320 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_b7aa62dd-2de4-4511-a7e7-27f45fe97cc1/installer/0.log" Mar 12 12:23:41.454723 master-0 kubenswrapper[7320]: I0312 12:23:41.454605 7320 generic.go:334] "Generic (PLEG): container finished" podID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerID="1d7fe59666c56c328d192c878aec970270ae5b794f380892a172dc12ef6839ec" exitCode=1 Mar 12 12:23:41.454723 master-0 kubenswrapper[7320]: I0312 12:23:41.454644 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1","Type":"ContainerDied","Data":"1d7fe59666c56c328d192c878aec970270ae5b794f380892a172dc12ef6839ec"} Mar 12 12:23:41.459081 master-0 kubenswrapper[7320]: I0312 12:23:41.459060 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" event={"ID":"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee","Type":"ContainerStarted","Data":"394d7487c1ec275042e8351fac012f934a8a70eca956f873f66a0af75b226580"} Mar 12 12:23:41.469677 master-0 kubenswrapper[7320]: I0312 12:23:41.469615 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" event={"ID":"55bf535c-93ab-4870-a9d2-c02496d71ef0","Type":"ContainerStarted","Data":"eacbd3dbba1971884046f11cb35d6c55c398c982d4c7ebc931f8b98f3db8eaac"} Mar 12 12:23:41.519140 master-0 kubenswrapper[7320]: I0312 12:23:41.518532 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=75.51851143 podStartE2EDuration="1m15.51851143s" podCreationTimestamp="2026-03-12 12:22:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:23:41.492529672 +0000 UTC m=+144.051573543" watchObservedRunningTime="2026-03-12 
12:23:41.51851143 +0000 UTC m=+144.077555311" Mar 12 12:23:41.579073 master-0 kubenswrapper[7320]: I0312 12:23:41.578983 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=70.578965219 podStartE2EDuration="1m10.578965219s" podCreationTimestamp="2026-03-12 12:22:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:23:41.577502932 +0000 UTC m=+144.136546813" watchObservedRunningTime="2026-03-12 12:23:41.578965219 +0000 UTC m=+144.138009100" Mar 12 12:23:41.767036 master-0 kubenswrapper[7320]: I0312 12:23:41.766989 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a381a5c1-050f-496f-b13f-eefa7310557f" path="/var/lib/kubelet/pods/a381a5c1-050f-496f-b13f-eefa7310557f/volumes" Mar 12 12:23:41.767725 master-0 kubenswrapper[7320]: I0312 12:23:41.767699 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a59a6bb7-f966-4208-ba85-452095404891" path="/var/lib/kubelet/pods/a59a6bb7-f966-4208-ba85-452095404891/volumes" Mar 12 12:23:42.479576 master-0 kubenswrapper[7320]: I0312 12:23:42.479509 7320 generic.go:334] "Generic (PLEG): container finished" podID="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" containerID="d357ccc688b993b9454b28bfd7fb28a5d58ecf020cbf9839477bf958a0d7b96f" exitCode=0 Mar 12 12:23:42.479990 master-0 kubenswrapper[7320]: I0312 12:23:42.479944 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" event={"ID":"0aeeef2a-f9df-4f87-b985-bd1da94c76c3","Type":"ContainerDied","Data":"d357ccc688b993b9454b28bfd7fb28a5d58ecf020cbf9839477bf958a0d7b96f"} Mar 12 12:23:42.480849 master-0 kubenswrapper[7320]: I0312 12:23:42.480328 7320 scope.go:117] "RemoveContainer" 
containerID="d357ccc688b993b9454b28bfd7fb28a5d58ecf020cbf9839477bf958a0d7b96f" Mar 12 12:23:42.932377 master-0 kubenswrapper[7320]: I0312 12:23:42.932344 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_b7aa62dd-2de4-4511-a7e7-27f45fe97cc1/installer/0.log" Mar 12 12:23:42.938999 master-0 kubenswrapper[7320]: I0312 12:23:42.932410 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:23:43.125712 master-0 kubenswrapper[7320]: I0312 12:23:43.125680 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kubelet-dir\") pod \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " Mar 12 12:23:43.125896 master-0 kubenswrapper[7320]: I0312 12:23:43.125728 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kube-api-access\") pod \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " Mar 12 12:23:43.125896 master-0 kubenswrapper[7320]: I0312 12:23:43.125782 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-var-lock\") pod \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\" (UID: \"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1\") " Mar 12 12:23:43.125896 master-0 kubenswrapper[7320]: I0312 12:23:43.125814 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" (UID: "b7aa62dd-2de4-4511-a7e7-27f45fe97cc1"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:23:43.125896 master-0 kubenswrapper[7320]: I0312 12:23:43.125886 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-var-lock" (OuterVolumeSpecName: "var-lock") pod "b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" (UID: "b7aa62dd-2de4-4511-a7e7-27f45fe97cc1"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:23:43.126060 master-0 kubenswrapper[7320]: I0312 12:23:43.125983 7320 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:23:43.126060 master-0 kubenswrapper[7320]: I0312 12:23:43.125994 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:23:43.130178 master-0 kubenswrapper[7320]: I0312 12:23:43.130127 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" (UID: "b7aa62dd-2de4-4511-a7e7-27f45fe97cc1"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:23:43.227356 master-0 kubenswrapper[7320]: I0312 12:23:43.227316 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7aa62dd-2de4-4511-a7e7-27f45fe97cc1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:23:43.489797 master-0 kubenswrapper[7320]: I0312 12:23:43.489743 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_b7aa62dd-2de4-4511-a7e7-27f45fe97cc1/installer/0.log" Mar 12 12:23:43.490083 master-0 kubenswrapper[7320]: I0312 12:23:43.489911 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"b7aa62dd-2de4-4511-a7e7-27f45fe97cc1","Type":"ContainerDied","Data":"06325f241e9a80c6d8dd85ba7c0c06cd73e1284031ca3e0cffe506e69d0bffc7"} Mar 12 12:23:43.490083 master-0 kubenswrapper[7320]: I0312 12:23:43.489958 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06325f241e9a80c6d8dd85ba7c0c06cd73e1284031ca3e0cffe506e69d0bffc7" Mar 12 12:23:43.490083 master-0 kubenswrapper[7320]: I0312 12:23:43.489967 7320 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:23:43.492674 master-0 kubenswrapper[7320]: I0312 12:23:43.492222 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" event={"ID":"c62edaec-38e2-4b73-8bb5-c776abfb310f","Type":"ContainerStarted","Data":"3b494a9f1b81fd19a42a7bdbd404289733966fcd678f30d029c5042c6b33a7f8"} Mar 12 12:23:43.497686 master-0 kubenswrapper[7320]: I0312 12:23:43.494020 7320 generic.go:334] "Generic (PLEG): container finished" podID="ab087440-bdf2-4e2f-9a5a-434d50a2329a" containerID="c10ad00e6d5ca94dd8aee1068bcae9a35fd5744bbc9fa9703850b00e0063db31" exitCode=0 Mar 12 12:23:43.497686 master-0 kubenswrapper[7320]: I0312 12:23:43.494147 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" event={"ID":"ab087440-bdf2-4e2f-9a5a-434d50a2329a","Type":"ContainerDied","Data":"c10ad00e6d5ca94dd8aee1068bcae9a35fd5744bbc9fa9703850b00e0063db31"} Mar 12 12:23:43.497686 master-0 kubenswrapper[7320]: I0312 12:23:43.494793 7320 scope.go:117] "RemoveContainer" containerID="c10ad00e6d5ca94dd8aee1068bcae9a35fd5744bbc9fa9703850b00e0063db31" Mar 12 12:23:43.505193 master-0 kubenswrapper[7320]: I0312 12:23:43.503946 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" event={"ID":"0aeeef2a-f9df-4f87-b985-bd1da94c76c3","Type":"ContainerStarted","Data":"13d6e3d4ec61f1e849f9a3f93e128e87d996c76cefb71ba46af5b1c143ef3967"} Mar 12 12:23:43.516827 master-0 kubenswrapper[7320]: I0312 12:23:43.516714 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" podStartSLOduration=70.180310494 podStartE2EDuration="1m12.516683771s" podCreationTimestamp="2026-03-12 12:22:31 +0000 UTC" 
firstStartedPulling="2026-03-12 12:23:40.608442603 +0000 UTC m=+143.167486524" lastFinishedPulling="2026-03-12 12:23:42.94481592 +0000 UTC m=+145.503859801" observedRunningTime="2026-03-12 12:23:43.51434881 +0000 UTC m=+146.073392751" watchObservedRunningTime="2026-03-12 12:23:43.516683771 +0000 UTC m=+146.075727682" Mar 12 12:23:44.247118 master-0 kubenswrapper[7320]: I0312 12:23:44.247059 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 12 12:23:44.517250 master-0 kubenswrapper[7320]: I0312 12:23:44.516873 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" event={"ID":"ab087440-bdf2-4e2f-9a5a-434d50a2329a","Type":"ContainerStarted","Data":"9561b49f3e6e4bd4b2bd8099eb3e3f1cd67c706451785c2fd2a2d282caaf82ac"} Mar 12 12:23:44.528637 master-0 kubenswrapper[7320]: E0312 12:23:44.528594 7320 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 12 12:23:45.881688 master-0 kubenswrapper[7320]: I0312 12:23:45.881610 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 12 12:23:45.911420 master-0 kubenswrapper[7320]: I0312 12:23:45.911344 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 12 12:23:45.950208 master-0 kubenswrapper[7320]: I0312 12:23:45.950135 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=1.950116592 podStartE2EDuration="1.950116592s" podCreationTimestamp="2026-03-12 12:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:23:44.575607282 +0000 UTC m=+147.134651163" watchObservedRunningTime="2026-03-12 12:23:45.950116592 +0000 UTC m=+148.509160473" Mar 12 12:23:46.583628 
master-0 kubenswrapper[7320]: I0312 12:23:46.570029 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 12 12:23:48.184923 master-0 kubenswrapper[7320]: I0312 12:23:48.184835 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:23:49.730440 master-0 kubenswrapper[7320]: I0312 12:23:49.730371 7320 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:23:49.731022 master-0 kubenswrapper[7320]: I0312 12:23:49.730820 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:23:51.422342 master-0 kubenswrapper[7320]: I0312 12:23:51.422307 7320 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:24:02.625777 master-0 kubenswrapper[7320]: I0312 12:24:02.625690 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-vpss8_a22189f2-3f35-4ea6-9892-39a1b46637e2/ingress-operator/0.log"
Mar 12 12:24:02.625777 master-0 kubenswrapper[7320]: I0312 12:24:02.625755 7320 generic.go:334] "Generic (PLEG): container finished" podID="a22189f2-3f35-4ea6-9892-39a1b46637e2" containerID="97f79ecdfa3c97644b3ca23d2c5dae1dd9db4d81745183dd21308d5c06844fd7" exitCode=1
Mar 12 12:24:02.625777 master-0 kubenswrapper[7320]: I0312 12:24:02.625788 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" event={"ID":"a22189f2-3f35-4ea6-9892-39a1b46637e2","Type":"ContainerDied","Data":"97f79ecdfa3c97644b3ca23d2c5dae1dd9db4d81745183dd21308d5c06844fd7"}
Mar 12 12:24:02.626746 master-0 kubenswrapper[7320]: I0312 12:24:02.626237 7320 scope.go:117] "RemoveContainer" containerID="97f79ecdfa3c97644b3ca23d2c5dae1dd9db4d81745183dd21308d5c06844fd7"
Mar 12 12:24:03.632788 master-0 kubenswrapper[7320]: I0312 12:24:03.632701 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-vpss8_a22189f2-3f35-4ea6-9892-39a1b46637e2/ingress-operator/0.log"
Mar 12 12:24:03.632788 master-0 kubenswrapper[7320]: I0312 12:24:03.632771 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" event={"ID":"a22189f2-3f35-4ea6-9892-39a1b46637e2","Type":"ContainerStarted","Data":"966b8227bb6cf02c5c38f30692e11c73156d14d213e9c25e6d96b07db69fb8d3"}
Mar 12 12:24:10.437879 master-0 kubenswrapper[7320]: I0312 12:24:10.437832 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"]
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: E0312 12:24:10.438053 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a381a5c1-050f-496f-b13f-eefa7310557f" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438068 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="a381a5c1-050f-496f-b13f-eefa7310557f" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: E0312 12:24:10.438091 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438098 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: E0312 12:24:10.438112 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59a6bb7-f966-4208-ba85-452095404891" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438119 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59a6bb7-f966-4208-ba85-452095404891" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: E0312 12:24:10.438131 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438139 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438224 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438243 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="a381a5c1-050f-496f-b13f-eefa7310557f" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438255 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59a6bb7-f966-4208-ba85-452095404891" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438266 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerName="installer"
Mar 12 12:24:10.442383 master-0 kubenswrapper[7320]: I0312 12:24:10.438872 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.447576 master-0 kubenswrapper[7320]: I0312 12:24:10.447534 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"]
Mar 12 12:24:10.450131 master-0 kubenswrapper[7320]: I0312 12:24:10.449307 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 12 12:24:10.451130 master-0 kubenswrapper[7320]: I0312 12:24:10.449376 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 12 12:24:10.451130 master-0 kubenswrapper[7320]: I0312 12:24:10.449410 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-pmfnh"
Mar 12 12:24:10.451130 master-0 kubenswrapper[7320]: I0312 12:24:10.449450 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 12 12:24:10.451130 master-0 kubenswrapper[7320]: I0312 12:24:10.449542 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 12 12:24:10.451130 master-0 kubenswrapper[7320]: I0312 12:24:10.449588 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 12 12:24:10.452288 master-0 kubenswrapper[7320]: I0312 12:24:10.452269 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.460300 master-0 kubenswrapper[7320]: I0312 12:24:10.454532 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"]
Mar 12 12:24:10.460300 master-0 kubenswrapper[7320]: I0312 12:24:10.455430 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"
Mar 12 12:24:10.460300 master-0 kubenswrapper[7320]: I0312 12:24:10.457859 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"]
Mar 12 12:24:10.460300 master-0 kubenswrapper[7320]: I0312 12:24:10.459410 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 12 12:24:10.460300 master-0 kubenswrapper[7320]: I0312 12:24:10.459593 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 12 12:24:10.460300 master-0 kubenswrapper[7320]: I0312 12:24:10.460119 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-qwrgw"
Mar 12 12:24:10.460300 master-0 kubenswrapper[7320]: I0312 12:24:10.460237 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 12 12:24:10.464597 master-0 kubenswrapper[7320]: I0312 12:24:10.463696 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 12 12:24:10.464597 master-0 kubenswrapper[7320]: I0312 12:24:10.464100 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 12:24:10.467221 master-0 kubenswrapper[7320]: I0312 12:24:10.466892 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 12 12:24:10.467356 master-0 kubenswrapper[7320]: I0312 12:24:10.467324 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 12 12:24:10.467585 master-0 kubenswrapper[7320]: I0312 12:24:10.467561 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 12 12:24:10.467649 master-0 kubenswrapper[7320]: I0312 12:24:10.467595 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-8fdxz"
Mar 12 12:24:10.468087 master-0 kubenswrapper[7320]: I0312 12:24:10.468058 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"]
Mar 12 12:24:10.476587 master-0 kubenswrapper[7320]: I0312 12:24:10.468527 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"]
Mar 12 12:24:10.476587 master-0 kubenswrapper[7320]: I0312 12:24:10.468592 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:10.476587 master-0 kubenswrapper[7320]: I0312 12:24:10.468603 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"
Mar 12 12:24:10.476587 master-0 kubenswrapper[7320]: I0312 12:24:10.470212 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 12 12:24:10.484683 master-0 kubenswrapper[7320]: I0312 12:24:10.484638 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-7skrh"
Mar 12 12:24:10.484944 master-0 kubenswrapper[7320]: I0312 12:24:10.484914 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 12 12:24:10.485161 master-0 kubenswrapper[7320]: I0312 12:24:10.485140 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 12 12:24:10.485462 master-0 kubenswrapper[7320]: I0312 12:24:10.484567 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 12 12:24:10.485912 master-0 kubenswrapper[7320]: I0312 12:24:10.485890 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 12 12:24:10.493403 master-0 kubenswrapper[7320]: I0312 12:24:10.492831 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"]
Mar 12 12:24:10.493403 master-0 kubenswrapper[7320]: I0312 12:24:10.492884 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"]
Mar 12 12:24:10.494894 master-0 kubenswrapper[7320]: I0312 12:24:10.494838 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.494894 master-0 kubenswrapper[7320]: I0312 12:24:10.494889 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s28gq\" (UniqueName: \"kubernetes.io/projected/f19c3c89-8d32-4394-bd86-e5ef7734c42b-kube-api-access-s28gq\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"
Mar 12 12:24:10.495016 master-0 kubenswrapper[7320]: I0312 12:24:10.494914 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"
Mar 12 12:24:10.495016 master-0 kubenswrapper[7320]: I0312 12:24:10.494933 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.495016 master-0 kubenswrapper[7320]: I0312 12:24:10.494957 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5jgq\" (UniqueName: \"kubernetes.io/projected/81cb0504-9455-4398-aed1-5cc6790f292e-kube-api-access-j5jgq\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.495016 master-0 kubenswrapper[7320]: I0312 12:24:10.494979 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/81cb0504-9455-4398-aed1-5cc6790f292e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.495177 master-0 kubenswrapper[7320]: I0312 12:24:10.495074 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:10.495177 master-0 kubenswrapper[7320]: I0312 12:24:10.495099 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd4jz\" (UniqueName: \"kubernetes.io/projected/02d5a507-4409-44b4-98bc-1751cdcc6c6a-kube-api-access-jd4jz\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:10.495177 master-0 kubenswrapper[7320]: I0312 12:24:10.495119 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:10.495298 master-0 kubenswrapper[7320]: I0312 12:24:10.495174 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.495298 master-0 kubenswrapper[7320]: I0312 12:24:10.495219 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.495381 master-0 kubenswrapper[7320]: I0312 12:24:10.495315 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.495381 master-0 kubenswrapper[7320]: I0312 12:24:10.495338 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/81cb0504-9455-4398-aed1-5cc6790f292e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.495381 master-0 kubenswrapper[7320]: I0312 12:24:10.495372 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b704e7-1291-4645-8a0d-2a937829d7ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"
Mar 12 12:24:10.495558 master-0 kubenswrapper[7320]: I0312 12:24:10.495407 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb249\" (UniqueName: \"kubernetes.io/projected/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-kube-api-access-tb249\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.495558 master-0 kubenswrapper[7320]: I0312 12:24:10.495439 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zct7x\" (UniqueName: \"kubernetes.io/projected/f3b704e7-1291-4645-8a0d-2a937829d7ac-kube-api-access-zct7x\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"
Mar 12 12:24:10.495743 master-0 kubenswrapper[7320]: I0312 12:24:10.495710 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-drmbk"
Mar 12 12:24:10.533800 master-0 kubenswrapper[7320]: I0312 12:24:10.532449 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"]
Mar 12 12:24:10.533800 master-0 kubenswrapper[7320]: I0312 12:24:10.533172 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"
Mar 12 12:24:10.537390 master-0 kubenswrapper[7320]: I0312 12:24:10.535470 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 12 12:24:10.537390 master-0 kubenswrapper[7320]: I0312 12:24:10.535714 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 12 12:24:10.541150 master-0 kubenswrapper[7320]: I0312 12:24:10.541073 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-4vzl8"]
Mar 12 12:24:10.541340 master-0 kubenswrapper[7320]: I0312 12:24:10.541248 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-dk5lp"
Mar 12 12:24:10.543372 master-0 kubenswrapper[7320]: I0312 12:24:10.543218 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8"
Mar 12 12:24:10.543999 master-0 kubenswrapper[7320]: I0312 12:24:10.543821 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"]
Mar 12 12:24:10.544585 master-0 kubenswrapper[7320]: I0312 12:24:10.544564 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:10.546315 master-0 kubenswrapper[7320]: I0312 12:24:10.545661 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"]
Mar 12 12:24:10.546315 master-0 kubenswrapper[7320]: I0312 12:24:10.546268 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"
Mar 12 12:24:10.546636 master-0 kubenswrapper[7320]: I0312 12:24:10.546615 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 12 12:24:10.546798 master-0 kubenswrapper[7320]: I0312 12:24:10.546780 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-srrbs"
Mar 12 12:24:10.547002 master-0 kubenswrapper[7320]: I0312 12:24:10.546956 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 12 12:24:10.547105 master-0 kubenswrapper[7320]: I0312 12:24:10.547089 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 12 12:24:10.547286 master-0 kubenswrapper[7320]: I0312 12:24:10.547272 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 12 12:24:10.547332 master-0 kubenswrapper[7320]: I0312 12:24:10.547280 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-hjhdl"
Mar 12 12:24:10.547364 master-0 kubenswrapper[7320]: I0312 12:24:10.547346 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 12 12:24:10.547626 master-0 kubenswrapper[7320]: I0312 12:24:10.547609 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 12 12:24:10.547741 master-0 kubenswrapper[7320]: I0312 12:24:10.547726 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 12 12:24:10.547789 master-0 kubenswrapper[7320]: I0312 12:24:10.547771 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 12 12:24:10.547832 master-0 kubenswrapper[7320]: I0312 12:24:10.547732 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"]
Mar 12 12:24:10.547915 master-0 kubenswrapper[7320]: I0312 12:24:10.547900 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 12 12:24:10.548048 master-0 kubenswrapper[7320]: I0312 12:24:10.548034 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 12 12:24:10.548372 master-0 kubenswrapper[7320]: I0312 12:24:10.548345 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"
Mar 12 12:24:10.550177 master-0 kubenswrapper[7320]: I0312 12:24:10.550152 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 12 12:24:10.550361 master-0 kubenswrapper[7320]: I0312 12:24:10.550320 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 12 12:24:10.550400 master-0 kubenswrapper[7320]: I0312 12:24:10.550387 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 12 12:24:10.550494 master-0 kubenswrapper[7320]: I0312 12:24:10.550450 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 12 12:24:10.550754 master-0 kubenswrapper[7320]: I0312 12:24:10.550575 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 12 12:24:10.550754 master-0 kubenswrapper[7320]: I0312 12:24:10.550598 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 12 12:24:10.550754 master-0 kubenswrapper[7320]: I0312 12:24:10.550670 7320 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 12 12:24:10.550754 master-0 kubenswrapper[7320]: I0312 12:24:10.550500 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-k9zjm"
Mar 12 12:24:10.550864 master-0 kubenswrapper[7320]: I0312 12:24:10.550780 7320 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dbn77"
Mar 12 12:24:10.569834 master-0 kubenswrapper[7320]: I0312 12:24:10.567255 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"]
Mar 12 12:24:10.575456 master-0 kubenswrapper[7320]: I0312 12:24:10.575414 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"]
Mar 12 12:24:10.580934 master-0 kubenswrapper[7320]: I0312 12:24:10.580805 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"]
Mar 12 12:24:10.585622 master-0 kubenswrapper[7320]: I0312 12:24:10.585337 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-4vzl8"]
Mar 12 12:24:10.593518 master-0 kubenswrapper[7320]: I0312 12:24:10.589607 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"]
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.595893 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.595932 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.595957 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-images\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596030 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596098 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bab9dba-235f-467c-9224-634cca9acbd2-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596146 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-config\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596176 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns72p\" (UniqueName: \"kubernetes.io/projected/99a11fe6-48a1-439e-b788-158dbe267dcd-kube-api-access-ns72p\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596213 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s28gq\" (UniqueName: \"kubernetes.io/projected/f19c3c89-8d32-4394-bd86-e5ef7734c42b-kube-api-access-s28gq\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596249 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596280 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596304 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-images\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596341 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596379 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5jgq\" (UniqueName: \"kubernetes.io/projected/81cb0504-9455-4398-aed1-5cc6790f292e-kube-api-access-j5jgq\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596407 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/632651f7-6641-49d8-9c48-7f6ea5846538-cert\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596441 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/81cb0504-9455-4398-aed1-5cc6790f292e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596495 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596538 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd4jz\" (UniqueName: \"kubernetes.io/projected/02d5a507-4409-44b4-98bc-1751cdcc6c6a-kube-api-access-jd4jz\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596589 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596598 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596628 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a11fe6-48a1-439e-b788-158dbe267dcd-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596655 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596682 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596707 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv9kn\" (UniqueName: \"kubernetes.io/projected/2bab9dba-235f-467c-9224-634cca9acbd2-kube-api-access-cv9kn\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596734 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596766 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596788 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86k82\" (UniqueName: \"kubernetes.io/projected/632651f7-6641-49d8-9c48-7f6ea5846538-kube-api-access-86k82\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596818 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/81cb0504-9455-4398-aed1-5cc6790f292e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596856 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b704e7-1291-4645-8a0d-2a937829d7ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID:
\"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596883 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68c57a64-f30c-4caf-89ef-08bd0d36833e-snapshots\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596915 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-config\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596941 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4nbb\" (UniqueName: \"kubernetes.io/projected/68c57a64-f30c-4caf-89ef-08bd0d36833e-kube-api-access-x4nbb\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.596975 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb249\" (UniqueName: \"kubernetes.io/projected/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-kube-api-access-tb249\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:10.597014 master-0 kubenswrapper[7320]: I0312 12:24:10.597017 7320 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94src\" (UniqueName: \"kubernetes.io/projected/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-kube-api-access-94src\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.598030 master-0 kubenswrapper[7320]: I0312 12:24:10.597056 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.598030 master-0 kubenswrapper[7320]: I0312 12:24:10.597091 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zct7x\" (UniqueName: \"kubernetes.io/projected/f3b704e7-1291-4645-8a0d-2a937829d7ac-kube-api-access-zct7x\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" Mar 12 12:24:10.598030 master-0 kubenswrapper[7320]: I0312 12:24:10.597115 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-images\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.598030 master-0 kubenswrapper[7320]: I0312 12:24:10.597152 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.598030 master-0 kubenswrapper[7320]: I0312 12:24:10.597176 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68c57a64-f30c-4caf-89ef-08bd0d36833e-serving-cert\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.598030 master-0 kubenswrapper[7320]: I0312 12:24:10.597268 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:10.598030 master-0 kubenswrapper[7320]: I0312 12:24:10.597316 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/81cb0504-9455-4398-aed1-5cc6790f292e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:10.598268 master-0 kubenswrapper[7320]: I0312 12:24:10.598217 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " 
pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:10.598913 master-0 kubenswrapper[7320]: I0312 12:24:10.598885 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:10.599589 master-0 kubenswrapper[7320]: I0312 12:24:10.599528 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:10.604007 master-0 kubenswrapper[7320]: I0312 12:24:10.602718 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" Mar 12 12:24:10.604578 master-0 kubenswrapper[7320]: I0312 12:24:10.604514 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:10.604880 master-0 kubenswrapper[7320]: I0312 12:24:10.604839 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b704e7-1291-4645-8a0d-2a937829d7ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" Mar 12 12:24:10.619576 master-0 kubenswrapper[7320]: I0312 12:24:10.616240 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/81cb0504-9455-4398-aed1-5cc6790f292e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:10.620983 master-0 kubenswrapper[7320]: I0312 12:24:10.620948 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5jgq\" (UniqueName: \"kubernetes.io/projected/81cb0504-9455-4398-aed1-5cc6790f292e-kube-api-access-j5jgq\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:10.621128 master-0 kubenswrapper[7320]: I0312 12:24:10.621102 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zct7x\" (UniqueName: \"kubernetes.io/projected/f3b704e7-1291-4645-8a0d-2a937829d7ac-kube-api-access-zct7x\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" Mar 12 12:24:10.622416 master-0 kubenswrapper[7320]: I0312 12:24:10.622346 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:10.624043 master-0 kubenswrapper[7320]: I0312 12:24:10.624011 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28gq\" (UniqueName: \"kubernetes.io/projected/f19c3c89-8d32-4394-bd86-e5ef7734c42b-kube-api-access-s28gq\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" Mar 12 12:24:10.636523 master-0 kubenswrapper[7320]: I0312 12:24:10.635352 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd4jz\" (UniqueName: \"kubernetes.io/projected/02d5a507-4409-44b4-98bc-1751cdcc6c6a-kube-api-access-jd4jz\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:10.636523 master-0 kubenswrapper[7320]: I0312 12:24:10.636040 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb249\" (UniqueName: \"kubernetes.io/projected/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-kube-api-access-tb249\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:10.698432 master-0 kubenswrapper[7320]: I0312 12:24:10.698372 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-trusted-ca-bundle\") pod 
\"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.698432 master-0 kubenswrapper[7320]: I0312 12:24:10.698418 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-images\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.698432 master-0 kubenswrapper[7320]: I0312 12:24:10.698438 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698455 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68c57a64-f30c-4caf-89ef-08bd0d36833e-serving-cert\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698471 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698502 7320 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698523 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-images\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698545 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bab9dba-235f-467c-9224-634cca9acbd2-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698564 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-config\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698579 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns72p\" (UniqueName: \"kubernetes.io/projected/99a11fe6-48a1-439e-b788-158dbe267dcd-kube-api-access-ns72p\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698600 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-images\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698622 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698640 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/632651f7-6641-49d8-9c48-7f6ea5846538-cert\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698664 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a11fe6-48a1-439e-b788-158dbe267dcd-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.698685 master-0 kubenswrapper[7320]: I0312 12:24:10.698681 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:10.699017 master-0 kubenswrapper[7320]: I0312 12:24:10.698701 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv9kn\" (UniqueName: \"kubernetes.io/projected/2bab9dba-235f-467c-9224-634cca9acbd2-kube-api-access-cv9kn\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.699017 master-0 kubenswrapper[7320]: I0312 12:24:10.698720 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86k82\" (UniqueName: \"kubernetes.io/projected/632651f7-6641-49d8-9c48-7f6ea5846538-kube-api-access-86k82\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:10.699017 master-0 kubenswrapper[7320]: I0312 12:24:10.698739 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68c57a64-f30c-4caf-89ef-08bd0d36833e-snapshots\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.699017 master-0 kubenswrapper[7320]: I0312 12:24:10.698756 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-config\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.699017 master-0 kubenswrapper[7320]: I0312 12:24:10.698773 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4nbb\" (UniqueName: \"kubernetes.io/projected/68c57a64-f30c-4caf-89ef-08bd0d36833e-kube-api-access-x4nbb\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.699017 master-0 kubenswrapper[7320]: I0312 12:24:10.698789 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94src\" (UniqueName: \"kubernetes.io/projected/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-kube-api-access-94src\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.699602 master-0 kubenswrapper[7320]: I0312 12:24:10.699573 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.700082 master-0 kubenswrapper[7320]: I0312 12:24:10.700049 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-config\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.700148 master-0 kubenswrapper[7320]: I0312 12:24:10.700053 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:10.700148 master-0 kubenswrapper[7320]: I0312 12:24:10.700127 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.700463 master-0 kubenswrapper[7320]: I0312 12:24:10.700444 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-images\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.701330 master-0 kubenswrapper[7320]: I0312 12:24:10.700685 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.702008 master-0 kubenswrapper[7320]: I0312 12:24:10.700868 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68c57a64-f30c-4caf-89ef-08bd0d36833e-snapshots\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.702130 master-0 
kubenswrapper[7320]: I0312 12:24:10.701276 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-config\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.702836 master-0 kubenswrapper[7320]: I0312 12:24:10.702814 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-images\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.703407 master-0 kubenswrapper[7320]: I0312 12:24:10.703389 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68c57a64-f30c-4caf-89ef-08bd0d36833e-serving-cert\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.704018 master-0 kubenswrapper[7320]: I0312 12:24:10.703969 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-images\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.704678 master-0 kubenswrapper[7320]: I0312 12:24:10.704634 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.705013 master-0 kubenswrapper[7320]: I0312 12:24:10.704974 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.705215 master-0 kubenswrapper[7320]: I0312 12:24:10.705177 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bab9dba-235f-467c-9224-634cca9acbd2-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.705786 master-0 kubenswrapper[7320]: I0312 12:24:10.705756 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/632651f7-6641-49d8-9c48-7f6ea5846538-cert\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:10.706747 master-0 kubenswrapper[7320]: I0312 12:24:10.706708 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a11fe6-48a1-439e-b788-158dbe267dcd-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.713083 master-0 kubenswrapper[7320]: I0312 12:24:10.713011 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94src\" (UniqueName: 
\"kubernetes.io/projected/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-kube-api-access-94src\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:10.716284 master-0 kubenswrapper[7320]: I0312 12:24:10.716246 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv9kn\" (UniqueName: \"kubernetes.io/projected/2bab9dba-235f-467c-9224-634cca9acbd2-kube-api-access-cv9kn\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.716723 master-0 kubenswrapper[7320]: I0312 12:24:10.716700 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns72p\" (UniqueName: \"kubernetes.io/projected/99a11fe6-48a1-439e-b788-158dbe267dcd-kube-api-access-ns72p\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.718951 master-0 kubenswrapper[7320]: I0312 12:24:10.718547 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86k82\" (UniqueName: \"kubernetes.io/projected/632651f7-6641-49d8-9c48-7f6ea5846538-kube-api-access-86k82\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:10.722398 master-0 kubenswrapper[7320]: I0312 12:24:10.722351 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nbb\" (UniqueName: \"kubernetes.io/projected/68c57a64-f30c-4caf-89ef-08bd0d36833e-kube-api-access-x4nbb\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " 
pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.768648 master-0 kubenswrapper[7320]: I0312 12:24:10.768579 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:10.787290 master-0 kubenswrapper[7320]: W0312 12:24:10.787238 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6243e45c_6e83_4fe0_b619_f7bf9e5d4dbc.slice/crio-a73570b27cdf14004fcfc5c69eadd2063cb040486e8ff1a903a8a1c41a806cb5 WatchSource:0}: Error finding container a73570b27cdf14004fcfc5c69eadd2063cb040486e8ff1a903a8a1c41a806cb5: Status 404 returned error can't find the container with id a73570b27cdf14004fcfc5c69eadd2063cb040486e8ff1a903a8a1c41a806cb5 Mar 12 12:24:10.793182 master-0 kubenswrapper[7320]: I0312 12:24:10.792905 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:10.811933 master-0 kubenswrapper[7320]: I0312 12:24:10.811819 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" Mar 12 12:24:10.821873 master-0 kubenswrapper[7320]: I0312 12:24:10.821831 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:10.830918 master-0 kubenswrapper[7320]: I0312 12:24:10.830872 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" Mar 12 12:24:10.844799 master-0 kubenswrapper[7320]: W0312 12:24:10.844750 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81cb0504_9455_4398_aed1_5cc6790f292e.slice/crio-d8d2d7e73e7439ba1eb1229f42a2e5a6be0805b87e3874b733e04979b526355b WatchSource:0}: Error finding container d8d2d7e73e7439ba1eb1229f42a2e5a6be0805b87e3874b733e04979b526355b: Status 404 returned error can't find the container with id d8d2d7e73e7439ba1eb1229f42a2e5a6be0805b87e3874b733e04979b526355b Mar 12 12:24:10.850846 master-0 kubenswrapper[7320]: I0312 12:24:10.850804 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:10.864868 master-0 kubenswrapper[7320]: I0312 12:24:10.864402 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:10.874884 master-0 kubenswrapper[7320]: I0312 12:24:10.874834 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:10.889022 master-0 kubenswrapper[7320]: I0312 12:24:10.888725 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:10.952776 master-0 kubenswrapper[7320]: I0312 12:24:10.950707 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:11.267802 master-0 kubenswrapper[7320]: I0312 12:24:11.267769 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"] Mar 12 12:24:11.271421 master-0 kubenswrapper[7320]: W0312 12:24:11.271388 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02d5a507_4409_44b4_98bc_1751cdcc6c6a.slice/crio-f6b1422fb985a54d196c8cc057ae0368a92f6fe87e61b3d31c43cf22a41a1666 WatchSource:0}: Error finding container f6b1422fb985a54d196c8cc057ae0368a92f6fe87e61b3d31c43cf22a41a1666: Status 404 returned error can't find the container with id f6b1422fb985a54d196c8cc057ae0368a92f6fe87e61b3d31c43cf22a41a1666 Mar 12 12:24:11.366856 master-0 kubenswrapper[7320]: I0312 12:24:11.366795 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"] Mar 12 12:24:11.388062 master-0 kubenswrapper[7320]: I0312 12:24:11.388030 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"] Mar 12 12:24:11.401759 master-0 kubenswrapper[7320]: W0312 12:24:11.401704 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3b704e7_1291_4645_8a0d_2a937829d7ac.slice/crio-a9dc508c688ae34bc2eddc6709fa675fb20e54f9d7eb42f1172affaecbda59cc WatchSource:0}: Error finding container a9dc508c688ae34bc2eddc6709fa675fb20e54f9d7eb42f1172affaecbda59cc: Status 404 returned error can't find the container with id a9dc508c688ae34bc2eddc6709fa675fb20e54f9d7eb42f1172affaecbda59cc Mar 12 12:24:11.458586 master-0 kubenswrapper[7320]: I0312 12:24:11.454036 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-8f89dfddd-4vzl8"] Mar 12 12:24:11.472443 master-0 kubenswrapper[7320]: I0312 12:24:11.468704 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"] Mar 12 12:24:11.643268 master-0 kubenswrapper[7320]: I0312 12:24:11.643203 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"] Mar 12 12:24:11.678842 master-0 kubenswrapper[7320]: I0312 12:24:11.678723 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"] Mar 12 12:24:11.680274 master-0 kubenswrapper[7320]: I0312 12:24:11.680222 7320 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"] Mar 12 12:24:11.696041 master-0 kubenswrapper[7320]: I0312 12:24:11.693667 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" event={"ID":"68c57a64-f30c-4caf-89ef-08bd0d36833e","Type":"ContainerStarted","Data":"437f19dee9aaecaf0da0f9a4e2b870d07ad9f109bec28fd8d9b704649289798a"} Mar 12 12:24:11.696041 master-0 kubenswrapper[7320]: W0312 12:24:11.694936 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bab9dba_235f_467c_9224_634cca9acbd2.slice/crio-4668278e293f25c682097a5e7d9cfb181571373cacd2a15b29dc6c82fc88e8f4 WatchSource:0}: Error finding container 4668278e293f25c682097a5e7d9cfb181571373cacd2a15b29dc6c82fc88e8f4: Status 404 returned error can't find the container with id 4668278e293f25c682097a5e7d9cfb181571373cacd2a15b29dc6c82fc88e8f4 Mar 12 12:24:11.696545 master-0 kubenswrapper[7320]: I0312 12:24:11.696505 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" 
event={"ID":"02d5a507-4409-44b4-98bc-1751cdcc6c6a","Type":"ContainerStarted","Data":"c7cc7d118cf57e855a01e00be780d9d56060ab25c5b8731249eeebb11db93bd7"} Mar 12 12:24:11.696620 master-0 kubenswrapper[7320]: I0312 12:24:11.696553 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" event={"ID":"02d5a507-4409-44b4-98bc-1751cdcc6c6a","Type":"ContainerStarted","Data":"f6b1422fb985a54d196c8cc057ae0368a92f6fe87e61b3d31c43cf22a41a1666"} Mar 12 12:24:11.700040 master-0 kubenswrapper[7320]: I0312 12:24:11.700001 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" event={"ID":"81cb0504-9455-4398-aed1-5cc6790f292e","Type":"ContainerStarted","Data":"d8d2d7e73e7439ba1eb1229f42a2e5a6be0805b87e3874b733e04979b526355b"} Mar 12 12:24:11.706816 master-0 kubenswrapper[7320]: I0312 12:24:11.706043 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" event={"ID":"99a11fe6-48a1-439e-b788-158dbe267dcd","Type":"ContainerStarted","Data":"ad38671dc0e01021059e694690e391eb7e5f662e10f37c5a4ec0d76fc36b9929"} Mar 12 12:24:11.759409 master-0 kubenswrapper[7320]: I0312 12:24:11.759301 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" event={"ID":"f19c3c89-8d32-4394-bd86-e5ef7734c42b","Type":"ContainerStarted","Data":"cd26eb3215c3045fef95e54a836428d267922b388ae43b971d77bc3784523536"} Mar 12 12:24:11.777990 master-0 kubenswrapper[7320]: I0312 12:24:11.777826 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" event={"ID":"632651f7-6641-49d8-9c48-7f6ea5846538","Type":"ContainerStarted","Data":"8b83ded6f46446c00701c35152871cb98d8791222cb69b9949e8eab3e638a17e"} Mar 12 
12:24:11.777990 master-0 kubenswrapper[7320]: I0312 12:24:11.777865 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" event={"ID":"632651f7-6641-49d8-9c48-7f6ea5846538","Type":"ContainerStarted","Data":"48802dfcb8b0bd5f567eee89b3f4b5a9769fbf82dea70b650fc097ec0ea21366"} Mar 12 12:24:11.777990 master-0 kubenswrapper[7320]: I0312 12:24:11.777874 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" event={"ID":"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc","Type":"ContainerStarted","Data":"e416ed21082c7fbbba5cf6267742a6950866281043fbfe9303b75b74df1337d4"} Mar 12 12:24:11.777990 master-0 kubenswrapper[7320]: I0312 12:24:11.777884 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" event={"ID":"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc","Type":"ContainerStarted","Data":"a73570b27cdf14004fcfc5c69eadd2063cb040486e8ff1a903a8a1c41a806cb5"} Mar 12 12:24:11.780983 master-0 kubenswrapper[7320]: I0312 12:24:11.780451 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" event={"ID":"f3b704e7-1291-4645-8a0d-2a937829d7ac","Type":"ContainerStarted","Data":"a9dc508c688ae34bc2eddc6709fa675fb20e54f9d7eb42f1172affaecbda59cc"} Mar 12 12:24:12.789119 master-0 kubenswrapper[7320]: I0312 12:24:12.789068 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" event={"ID":"99a11fe6-48a1-439e-b788-158dbe267dcd","Type":"ContainerStarted","Data":"a01acaafaae1484c4af39c06e64d8d57ddc19774186ade0622b545a43f50fa34"} Mar 12 12:24:12.789119 master-0 kubenswrapper[7320]: I0312 12:24:12.789124 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" 
event={"ID":"99a11fe6-48a1-439e-b788-158dbe267dcd","Type":"ContainerStarted","Data":"1bc386ffa63a2ec11752a9c7ca7b3a646adea9e87a3d6e20220e3af8fe3c25ee"} Mar 12 12:24:12.792093 master-0 kubenswrapper[7320]: I0312 12:24:12.790846 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" event={"ID":"2bab9dba-235f-467c-9224-634cca9acbd2","Type":"ContainerStarted","Data":"706dd66630b99f7068924980b17589a12da31fdc11cbebfddea78635a8cf8735"} Mar 12 12:24:12.792093 master-0 kubenswrapper[7320]: I0312 12:24:12.790929 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" event={"ID":"2bab9dba-235f-467c-9224-634cca9acbd2","Type":"ContainerStarted","Data":"4668278e293f25c682097a5e7d9cfb181571373cacd2a15b29dc6c82fc88e8f4"} Mar 12 12:24:12.808723 master-0 kubenswrapper[7320]: I0312 12:24:12.808645 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerStarted","Data":"2ada9e185ebcd7796052d553750f50a0d8e59b4d4c070029c61ef23cd75f5a22"} Mar 12 12:24:12.900142 master-0 kubenswrapper[7320]: I0312 12:24:12.898492 7320 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" podStartSLOduration=2.898455306 podStartE2EDuration="2.898455306s" podCreationTimestamp="2026-03-12 12:24:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:24:12.894134187 +0000 UTC m=+175.453178088" watchObservedRunningTime="2026-03-12 12:24:12.898455306 +0000 UTC m=+175.457499187" Mar 12 12:24:13.875854 master-0 kubenswrapper[7320]: I0312 12:24:13.875804 7320 kubelet.go:2431] "SyncLoop REMOVE" source="file" 
pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 12 12:24:13.876312 master-0 kubenswrapper[7320]: I0312 12:24:13.876055 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://a7884f3a85cd7c647eb2f8e86a22aa409d3d4d1224200c8d3e0d66ec06385f1c" gracePeriod=30 Mar 12 12:24:13.876359 master-0 kubenswrapper[7320]: I0312 12:24:13.876341 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" containerID="cri-o://96e48ce640071ce1e3c5822f8f356a843319ddfef771dec6cce93f02b2946dac" gracePeriod=30 Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.876684 7320 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: E0312 12:24:13.876868 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.876883 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: E0312 12:24:13.876905 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.876913 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: E0312 12:24:13.876922 7320 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.876929 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.877033 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.877072 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.877103 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: E0312 12:24:13.877223 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877269 master-0 kubenswrapper[7320]: I0312 12:24:13.877233 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.877643 master-0 kubenswrapper[7320]: I0312 12:24:13.877466 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 12 12:24:13.878644 master-0 kubenswrapper[7320]: I0312 12:24:13.878613 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:14.057175 master-0 kubenswrapper[7320]: I0312 12:24:14.057131 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:14.057408 master-0 kubenswrapper[7320]: I0312 12:24:14.057377 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:14.087665 master-0 kubenswrapper[7320]: I0312 12:24:14.087467 7320 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": dial tcp [::1]:10357: connect: connection refused" Mar 12 12:24:14.095689 master-0 kubenswrapper[7320]: I0312 12:24:14.095598 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:24:14.158992 master-0 kubenswrapper[7320]: I0312 12:24:14.158915 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 
12:24:14.158992 master-0 kubenswrapper[7320]: I0312 12:24:14.158997 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:14.159196 master-0 kubenswrapper[7320]: I0312 12:24:14.159040 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:14.159196 master-0 kubenswrapper[7320]: I0312 12:24:14.159044 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:14.393537 master-0 kubenswrapper[7320]: I0312 12:24:14.393423 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:15.836603 master-0 kubenswrapper[7320]: I0312 12:24:15.836100 7320 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="a7884f3a85cd7c647eb2f8e86a22aa409d3d4d1224200c8d3e0d66ec06385f1c" exitCode=0 Mar 12 12:24:15.836603 master-0 kubenswrapper[7320]: I0312 12:24:15.836159 7320 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="96e48ce640071ce1e3c5822f8f356a843319ddfef771dec6cce93f02b2946dac" exitCode=0 Mar 12 12:24:15.836603 master-0 kubenswrapper[7320]: I0312 12:24:15.836237 7320 scope.go:117] "RemoveContainer" containerID="b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3" Mar 12 12:24:17.854314 master-0 kubenswrapper[7320]: I0312 12:24:17.854211 7320 generic.go:334] "Generic (PLEG): container finished" podID="a97fcd56-aa52-414a-b370-154c1b34c1ed" containerID="9e58c221e9a2e73d89eb52ff2e2377c97caf0ea7574d33f3dc1598a292639881" exitCode=0 Mar 12 12:24:17.854314 master-0 kubenswrapper[7320]: I0312 12:24:17.854261 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"a97fcd56-aa52-414a-b370-154c1b34c1ed","Type":"ContainerDied","Data":"9e58c221e9a2e73d89eb52ff2e2377c97caf0ea7574d33f3dc1598a292639881"} Mar 12 12:24:18.807064 master-0 kubenswrapper[7320]: I0312 12:24:18.806984 7320 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 12 12:24:18.807649 master-0 kubenswrapper[7320]: I0312 12:24:18.807616 7320 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 12 12:24:18.807828 master-0 kubenswrapper[7320]: I0312 12:24:18.807782 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:18.807925 master-0 kubenswrapper[7320]: I0312 12:24:18.807862 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" containerID="cri-o://e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12" gracePeriod=15 Mar 12 12:24:18.807989 master-0 kubenswrapper[7320]: I0312 12:24:18.807907 7320 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f" gracePeriod=15 Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.809762 7320 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: E0312 12:24:18.810027 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.810043 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: E0312 12:24:18.810060 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.810069 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: E0312 
12:24:18.810082 7320 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.810090 7320 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.810185 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.810204 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.810217 7320 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 12 12:24:18.811936 master-0 kubenswrapper[7320]: I0312 12:24:18.811766 7320 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.817909 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.817973 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.817995 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.818015 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.818029 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.818047 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.818073 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:18.818181 master-0 kubenswrapper[7320]: I0312 12:24:18.818090 7320 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:18.919082 master-0 kubenswrapper[7320]: I0312 12:24:18.918944 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:18.919082 master-0 kubenswrapper[7320]: I0312 12:24:18.919034 7320 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920021 master-0 kubenswrapper[7320]: I0312 12:24:18.919175 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:24:18.920021 master-0 kubenswrapper[7320]: I0312 12:24:18.919242 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920021 master-0 kubenswrapper[7320]: I0312 12:24:18.919729 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920021 master-0 kubenswrapper[7320]: I0312 12:24:18.919822 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920021 master-0 kubenswrapper[7320]: I0312 12:24:18.919860 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920021 master-0 kubenswrapper[7320]: I0312 12:24:18.919917 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920461 master-0 kubenswrapper[7320]: I0312 12:24:18.920039 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:24:18.920461 master-0 kubenswrapper[7320]: I0312 12:24:18.920113 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920461 master-0 kubenswrapper[7320]: I0312 12:24:18.920164 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:24:18.920461 master-0 kubenswrapper[7320]: I0312 12:24:18.920241 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920461 master-0 kubenswrapper[7320]: I0312 12:24:18.920284 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:24:18.920851 master-0 kubenswrapper[7320]: I0312 12:24:18.920452 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:24:18.920851 master-0 kubenswrapper[7320]: I0312 12:24:18.920455 7320 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:18.920851 master-0 kubenswrapper[7320]: I0312 12:24:18.920551 7320 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:19.073002 master-0 kubenswrapper[7320]: I0312 12:24:19.072938 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:24:19.121912 master-0 kubenswrapper[7320]: I0312 12:24:19.121828 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 12 12:24:19.121912 master-0 kubenswrapper[7320]: I0312 12:24:19.121908 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.121976 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.121991 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.122030 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.122061 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.122132 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") "
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.122158 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs" (OuterVolumeSpecName: "logs") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.122266 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets" (OuterVolumeSpecName: "secrets") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:19.122396 master-0 kubenswrapper[7320]: I0312 12:24:19.122371 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config" (OuterVolumeSpecName: "config") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:19.122976 master-0 kubenswrapper[7320]: I0312 12:24:19.122626 7320 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:19.122976 master-0 kubenswrapper[7320]: I0312 12:24:19.122660 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:19.122976 master-0 kubenswrapper[7320]: I0312 12:24:19.122677 7320 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:19.122976 master-0 kubenswrapper[7320]: I0312 12:24:19.122698 7320 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:19.122976 master-0 kubenswrapper[7320]: I0312 12:24:19.122717 7320 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:19.570588 master-0 kubenswrapper[7320]: I0312 12:24:19.570519 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:24:19.570588 master-0 kubenswrapper[7320]: I0312 12:24:19.570571 7320 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:24:19.572721 master-0 kubenswrapper[7320]: I0312 12:24:19.572620 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 12 12:24:19.574633 master-0 kubenswrapper[7320]: I0312 12:24:19.574565 7320 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 12 12:24:19.763041 master-0 kubenswrapper[7320]: I0312 12:24:19.762984 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c05e1499b533b83f091333d61f045" path="/var/lib/kubelet/pods/f78c05e1499b533b83f091333d61f045/volumes"
Mar 12 12:24:19.763421 master-0 kubenswrapper[7320]: I0312 12:24:19.763400 7320 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID=""
Mar 12 12:24:19.838620 master-0 kubenswrapper[7320]: I0312 12:24:19.838569 7320 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 12 12:24:19.838837 master-0 kubenswrapper[7320]: I0312 12:24:19.838611 7320 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="aa48dbad-8c7f-4680-ad9e-963068fcc530"
Mar 12 12:24:19.843819 master-0 kubenswrapper[7320]: I0312 12:24:19.843754 7320 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 12 12:24:19.843819 master-0 kubenswrapper[7320]: I0312 12:24:19.843811 7320 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="aa48dbad-8c7f-4680-ad9e-963068fcc530"
Mar 12 12:24:19.874310 master-0 kubenswrapper[7320]: I0312 12:24:19.874279 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 12 12:24:19.876908 master-0 kubenswrapper[7320]: I0312 12:24:19.876866 7320 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f" exitCode=0
Mar 12 12:24:21.876495 master-0 kubenswrapper[7320]: E0312 12:24:21.876307 7320 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{cluster-autoscaler-operator-69576476f7-ph7gk.189c179050c4792c openshift-machine-api 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-api,Name:cluster-autoscaler-operator-69576476f7-ph7gk,UID:632651f7-6641-49d8-9c48-7f6ea5846538,APIVersion:v1,ResourceVersion:9536,FieldPath:spec.containers{cluster-autoscaler-operator},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:00b591b3820682dc99f16f07a3a0a4ec06dfedba63cd0f79b998ac4509fabea3\" in 10.195s (10.195s including waiting). Image size: 456374430 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:24:21.874915628 +0000 UTC m=+184.433959509,LastTimestamp:2026-03-12 12:24:21.874915628 +0000 UTC m=+184.433959509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 12 12:24:21.956751 master-0 kubenswrapper[7320]: I0312 12:24:21.956697 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:24:22.063372 master-0 kubenswrapper[7320]: I0312 12:24:22.063213 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"a97fcd56-aa52-414a-b370-154c1b34c1ed\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") "
Mar 12 12:24:22.063372 master-0 kubenswrapper[7320]: I0312 12:24:22.063292 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"a97fcd56-aa52-414a-b370-154c1b34c1ed\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") "
Mar 12 12:24:22.063372 master-0 kubenswrapper[7320]: I0312 12:24:22.063376 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"a97fcd56-aa52-414a-b370-154c1b34c1ed\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") "
Mar 12 12:24:22.063758 master-0 kubenswrapper[7320]: I0312 12:24:22.063524 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a97fcd56-aa52-414a-b370-154c1b34c1ed" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:22.063758 master-0 kubenswrapper[7320]: I0312 12:24:22.063592 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "a97fcd56-aa52-414a-b370-154c1b34c1ed" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:22.066037 master-0 kubenswrapper[7320]: I0312 12:24:22.066002 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a97fcd56-aa52-414a-b370-154c1b34c1ed" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:24:22.164735 master-0 kubenswrapper[7320]: I0312 12:24:22.164586 7320 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:22.164735 master-0 kubenswrapper[7320]: I0312 12:24:22.164622 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:22.164735 master-0 kubenswrapper[7320]: I0312 12:24:22.164634 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:22.324064 master-0 kubenswrapper[7320]: E0312 12:24:22.323965 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:22.324566 master-0 kubenswrapper[7320]: E0312 12:24:22.324520 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:22.325053 master-0 kubenswrapper[7320]: E0312 12:24:22.325009 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:22.325402 master-0 kubenswrapper[7320]: E0312 12:24:22.325367 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:22.325844 master-0 kubenswrapper[7320]: E0312 12:24:22.325808 7320 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:22.325844 master-0 kubenswrapper[7320]: I0312 12:24:22.325835 7320 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 12 12:24:22.326275 master-0 kubenswrapper[7320]: E0312 12:24:22.326242 7320 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 12 12:24:22.527302 master-0 kubenswrapper[7320]: E0312 12:24:22.527247 7320 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 12 12:24:22.892385 master-0 kubenswrapper[7320]: I0312 12:24:22.892285 7320 generic.go:334] "Generic (PLEG): container finished" podID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" containerID="85782e605fa997f78fa1b00da3fcf4b854ebddeada29c56b1d38c44587c26563" exitCode=0
Mar 12 12:24:22.892896 master-0 kubenswrapper[7320]: I0312 12:24:22.892366 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4","Type":"ContainerDied","Data":"85782e605fa997f78fa1b00da3fcf4b854ebddeada29c56b1d38c44587c26563"}
Mar 12 12:24:22.893790 master-0 kubenswrapper[7320]: I0312 12:24:22.893767 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"a97fcd56-aa52-414a-b370-154c1b34c1ed","Type":"ContainerDied","Data":"3cb6ab6d4b334e440fd45f035b87035c3f267128290d2900f9474bca9932ecf6"}
Mar 12 12:24:22.893859 master-0 kubenswrapper[7320]: I0312 12:24:22.893795 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb6ab6d4b334e440fd45f035b87035c3f267128290d2900f9474bca9932ecf6"
Mar 12 12:24:22.893859 master-0 kubenswrapper[7320]: I0312 12:24:22.893838 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:24:22.929023 master-0 kubenswrapper[7320]: E0312 12:24:22.928966 7320 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 12 12:24:23.730671 master-0 kubenswrapper[7320]: E0312 12:24:23.730606 7320 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 12 12:24:23.979454 master-0 kubenswrapper[7320]: W0312 12:24:23.978801 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod161fce36d846c7ce98305d8ed6c23827.slice/crio-8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6 WatchSource:0}: Error finding container 8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6: Status 404 returned error can't find the container with id 8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6
Mar 12 12:24:23.984781 master-0 kubenswrapper[7320]: I0312 12:24:23.984725 7320 scope.go:117] "RemoveContainer" containerID="a7884f3a85cd7c647eb2f8e86a22aa409d3d4d1224200c8d3e0d66ec06385f1c"
Mar 12 12:24:24.099290 master-0 kubenswrapper[7320]: I0312 12:24:24.099232 7320 scope.go:117] "RemoveContainer" containerID="b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3"
Mar 12 12:24:24.099945 master-0 kubenswrapper[7320]: E0312 12:24:24.099900 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3\": container with ID starting with b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3 not found: ID does not exist" containerID="b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3"
Mar 12 12:24:24.100027 master-0 kubenswrapper[7320]: I0312 12:24:24.099962 7320 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3"} err="failed to get container status \"b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3\": rpc error: code = NotFound desc = could not find container \"b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3\": container with ID starting with b74a138738198c275184b1e51c531106c7cb5a17599ddba44af60c50e77bc6e3 not found: ID does not exist"
Mar 12 12:24:24.100027 master-0 kubenswrapper[7320]: I0312 12:24:24.099992 7320 scope.go:117] "RemoveContainer" containerID="96e48ce640071ce1e3c5822f8f356a843319ddfef771dec6cce93f02b2946dac"
Mar 12 12:24:24.134586 master-0 kubenswrapper[7320]: W0312 12:24:24.134534 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df WatchSource:0}: Error finding container 648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df: Status 404 returned error can't find the container with id 648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df
Mar 12 12:24:24.150835 master-0 kubenswrapper[7320]: I0312 12:24:24.150773 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 12 12:24:24.156313 master-0 kubenswrapper[7320]: I0312 12:24:24.156267 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:24:24.159553 master-0 kubenswrapper[7320]: W0312 12:24:24.159518 7320 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf417e14665db2ffffa887ce21c9ff0ed.slice/crio-52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be WatchSource:0}: Error finding container 52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be: Status 404 returned error can't find the container with id 52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295180 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295228 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295250 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295293 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295332 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295356 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295379 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295396 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295422 7320 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") "
Mar 12 12:24:24.295711 master-0 kubenswrapper[7320]: I0312 12:24:24.295700 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.296018 master-0 kubenswrapper[7320]: I0312 12:24:24.295735 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock" (OuterVolumeSpecName: "var-lock") pod "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.296018 master-0 kubenswrapper[7320]: I0312 12:24:24.295752 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.296018 master-0 kubenswrapper[7320]: I0312 12:24:24.295766 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.296018 master-0 kubenswrapper[7320]: I0312 12:24:24.295780 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config" (OuterVolumeSpecName: "config") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.296426 master-0 kubenswrapper[7320]: I0312 12:24:24.296245 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets" (OuterVolumeSpecName: "secrets") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.296426 master-0 kubenswrapper[7320]: I0312 12:24:24.296258 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs" (OuterVolumeSpecName: "logs") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.296426 master-0 kubenswrapper[7320]: I0312 12:24:24.296290 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:24:24.298411 master-0 kubenswrapper[7320]: I0312 12:24:24.298383 7320 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396554 7320 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396592 7320 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396610 7320 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396621 7320 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396634 7320 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396644 7320 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396655 7320 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396666 7320 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.396743 master-0 kubenswrapper[7320]: I0312 12:24:24.396676 7320 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") on node \"master-0\" DevicePath \"\""
Mar 12 12:24:24.876745 master-0 kubenswrapper[7320]: I0312 12:24:24.876682 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:24.877721 master-0 kubenswrapper[7320]: I0312 12:24:24.877670 7320 status_manager.go:851] "Failed to get status for pod" podUID="f78c05e1499b533b83f091333d61f045" pod="kube-system/bootstrap-kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/kube-system/pods/bootstrap-kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:24.881756 master-0 kubenswrapper[7320]: I0312 12:24:24.881624 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:24.882708 master-0 kubenswrapper[7320]: I0312 12:24:24.882662 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:24.883081 master-0 kubenswrapper[7320]: I0312 12:24:24.883043 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:24.883851 master-0 kubenswrapper[7320]: I0312 12:24:24.883801 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:24.886666 master-0 kubenswrapper[7320]: I0312 12:24:24.885895 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:24:24.917800 master-0 kubenswrapper[7320]: I0312 12:24:24.917759 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"2e1c10ff585ac62d9bd97610178ea53b8be2cf0963171b0a4671fc1da4d09ac1"}
Mar 12 12:24:24.917868 master-0 kubenswrapper[7320]: I0312 12:24:24.917804 7320 kubelet.go:2453] "SyncLoop (PLEG): event
for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"cfafaefe1186e2aaf0be8643dd0cbdc3cd2e42fe49f94ac73da95ad15ec95688"} Mar 12 12:24:24.917868 master-0 kubenswrapper[7320]: I0312 12:24:24.917817 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6"} Mar 12 12:24:24.919186 master-0 kubenswrapper[7320]: I0312 12:24:24.919155 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" event={"ID":"f3b704e7-1291-4645-8a0d-2a937829d7ac","Type":"ContainerStarted","Data":"43680f7bbccbe0e894d252b9c7302ee1699f682897d030495b133d3d14f28d18"} Mar 12 12:24:24.920359 master-0 kubenswrapper[7320]: I0312 12:24:24.920311 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.920777 master-0 kubenswrapper[7320]: I0312 12:24:24.920744 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.921634 master-0 kubenswrapper[7320]: I0312 12:24:24.921596 7320 status_manager.go:851] "Failed to get status for pod" 
podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.923080 master-0 kubenswrapper[7320]: I0312 12:24:24.923045 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4","Type":"ContainerDied","Data":"d5a035c7049c39dfacabd51a96c5380843e394f28f3ad6d0e9bceda8fa427e90"} Mar 12 12:24:24.923080 master-0 kubenswrapper[7320]: I0312 12:24:24.923079 7320 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a035c7049c39dfacabd51a96c5380843e394f28f3ad6d0e9bceda8fa427e90" Mar 12 12:24:24.923157 master-0 kubenswrapper[7320]: I0312 12:24:24.923133 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:24.924170 master-0 kubenswrapper[7320]: I0312 12:24:24.924137 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.924656 master-0 kubenswrapper[7320]: I0312 12:24:24.924610 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.926791 master-0 kubenswrapper[7320]: I0312 12:24:24.926045 7320 generic.go:334] "Generic 
(PLEG): container finished" podID="68c57a64-f30c-4caf-89ef-08bd0d36833e" containerID="60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af" exitCode=0 Mar 12 12:24:24.926791 master-0 kubenswrapper[7320]: I0312 12:24:24.926110 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" event={"ID":"68c57a64-f30c-4caf-89ef-08bd0d36833e","Type":"ContainerDied","Data":"60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af"} Mar 12 12:24:24.926791 master-0 kubenswrapper[7320]: I0312 12:24:24.926445 7320 scope.go:117] "RemoveContainer" containerID="60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af" Mar 12 12:24:24.928304 master-0 kubenswrapper[7320]: I0312 12:24:24.928258 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.928926 master-0 kubenswrapper[7320]: I0312 12:24:24.928866 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.929440 master-0 kubenswrapper[7320]: I0312 12:24:24.929396 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.930018 
master-0 kubenswrapper[7320]: I0312 12:24:24.929970 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.930383 master-0 kubenswrapper[7320]: I0312 12:24:24.930355 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/0.log" Mar 12 12:24:24.930442 master-0 kubenswrapper[7320]: I0312 12:24:24.930413 7320 generic.go:334] "Generic (PLEG): container finished" podID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" containerID="8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14" exitCode=1 Mar 12 12:24:24.930442 master-0 kubenswrapper[7320]: I0312 12:24:24.930433 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerStarted","Data":"bafec38e3bcd46daebdd5cacb6b0437746f99565b365c26311aa5bbeddf533c1"} Mar 12 12:24:24.930519 master-0 kubenswrapper[7320]: I0312 12:24:24.930457 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerDied","Data":"8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14"} Mar 12 12:24:24.930519 master-0 kubenswrapper[7320]: I0312 12:24:24.930410 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.930876 master-0 kubenswrapper[7320]: I0312 12:24:24.930839 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.930990 master-0 kubenswrapper[7320]: I0312 12:24:24.930964 7320 scope.go:117] "RemoveContainer" containerID="8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14" Mar 12 12:24:24.931457 master-0 kubenswrapper[7320]: I0312 12:24:24.931424 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.931835 master-0 kubenswrapper[7320]: I0312 12:24:24.931803 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.932138 master-0 kubenswrapper[7320]: I0312 12:24:24.932106 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.932666 master-0 kubenswrapper[7320]: I0312 12:24:24.932633 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" event={"ID":"02d5a507-4409-44b4-98bc-1751cdcc6c6a","Type":"ContainerStarted","Data":"45d7aa6f87a3ed85b2607860daa09cbc5d487a99741600d37622bd40f58f8239"} Mar 12 12:24:24.932713 master-0 kubenswrapper[7320]: I0312 12:24:24.932657 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.933190 master-0 kubenswrapper[7320]: I0312 12:24:24.933145 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.936106 master-0 kubenswrapper[7320]: I0312 12:24:24.936070 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.936759 master-0 kubenswrapper[7320]: I0312 12:24:24.936725 7320 status_manager.go:851] "Failed to 
get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.937301 master-0 kubenswrapper[7320]: I0312 12:24:24.937268 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.937886 master-0 kubenswrapper[7320]: I0312 12:24:24.937859 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.938289 master-0 kubenswrapper[7320]: I0312 12:24:24.938221 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.938818 master-0 kubenswrapper[7320]: I0312 12:24:24.938787 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.939238 master-0 kubenswrapper[7320]: I0312 12:24:24.939198 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.939607 master-0 kubenswrapper[7320]: I0312 12:24:24.939579 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.940155 master-0 kubenswrapper[7320]: I0312 12:24:24.940086 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.940465 master-0 kubenswrapper[7320]: I0312 12:24:24.940421 7320 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12" exitCode=0 Mar 12 12:24:24.940625 master-0 kubenswrapper[7320]: I0312 12:24:24.940606 7320 scope.go:117] "RemoveContainer" containerID="f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f" Mar 12 
12:24:24.940760 master-0 kubenswrapper[7320]: I0312 12:24:24.940737 7320 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 12 12:24:24.941157 master-0 kubenswrapper[7320]: I0312 12:24:24.940993 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.942179 master-0 kubenswrapper[7320]: I0312 12:24:24.942148 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.942935 master-0 kubenswrapper[7320]: I0312 12:24:24.942844 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.943709 master-0 kubenswrapper[7320]: I0312 12:24:24.943673 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.944584 master-0 kubenswrapper[7320]: I0312 12:24:24.944554 7320 
status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.945711 master-0 kubenswrapper[7320]: I0312 12:24:24.945672 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.946132 master-0 kubenswrapper[7320]: I0312 12:24:24.946101 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.946759 master-0 kubenswrapper[7320]: I0312 12:24:24.946712 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.947177 master-0 kubenswrapper[7320]: I0312 12:24:24.947130 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.962512 master-0 kubenswrapper[7320]: I0312 12:24:24.962120 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.963561 master-0 kubenswrapper[7320]: I0312 12:24:24.962879 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.963561 master-0 kubenswrapper[7320]: I0312 12:24:24.963412 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.964135 master-0 kubenswrapper[7320]: I0312 12:24:24.964092 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.964329 master-0 
kubenswrapper[7320]: I0312 12:24:24.964296 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" event={"ID":"2bab9dba-235f-467c-9224-634cca9acbd2","Type":"ContainerStarted","Data":"2ed14edf7421ab677fa9c2cbcfd66d8fefd23747e7948839f3fcbffbd630fad1"} Mar 12 12:24:24.967137 master-0 kubenswrapper[7320]: I0312 12:24:24.966849 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.970604 master-0 kubenswrapper[7320]: I0312 12:24:24.968074 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.970604 master-0 kubenswrapper[7320]: I0312 12:24:24.968663 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.973669 master-0 kubenswrapper[7320]: I0312 12:24:24.971704 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.973669 master-0 kubenswrapper[7320]: I0312 12:24:24.972211 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.973669 master-0 kubenswrapper[7320]: I0312 12:24:24.972510 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.976079 master-0 kubenswrapper[7320]: I0312 12:24:24.976039 7320 status_manager.go:851] "Failed to get status for pod" podUID="2bab9dba-235f-467c-9224-634cca9acbd2" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-84bf6db4f9-nq8zw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.976145 master-0 kubenswrapper[7320]: I0312 12:24:24.976082 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"f0f3c0b6faeeb349351f1b371244c103cf179f81f4ae7b1577ee82387a636818"} Mar 12 12:24:24.976226 master-0 kubenswrapper[7320]: I0312 12:24:24.976061 7320 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" 
containerID="f0f3c0b6faeeb349351f1b371244c103cf179f81f4ae7b1577ee82387a636818" exitCode=0 Mar 12 12:24:24.976375 master-0 kubenswrapper[7320]: I0312 12:24:24.976356 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df"} Mar 12 12:24:24.976916 master-0 kubenswrapper[7320]: I0312 12:24:24.976883 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.977418 master-0 kubenswrapper[7320]: I0312 12:24:24.977390 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.977913 master-0 kubenswrapper[7320]: I0312 12:24:24.977874 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.978284 master-0 kubenswrapper[7320]: I0312 12:24:24.978256 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" 
pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.978647 master-0 kubenswrapper[7320]: I0312 12:24:24.978621 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.978976 master-0 kubenswrapper[7320]: I0312 12:24:24.978948 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.979383 master-0 kubenswrapper[7320]: I0312 12:24:24.979353 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.979917 master-0 kubenswrapper[7320]: I0312 12:24:24.979884 7320 status_manager.go:851] "Failed to get status for pod" podUID="2bab9dba-235f-467c-9224-634cca9acbd2" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-84bf6db4f9-nq8zw\": dial tcp 
192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.980528 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.981742 7320 scope.go:117] "RemoveContainer" containerID="e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.981972 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.983155 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.984843 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.985347 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.987095 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.987505 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.988798 master-0 kubenswrapper[7320]: I0312 12:24:24.987857 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" event={"ID":"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc","Type":"ContainerStarted","Data":"17336bfb565956ddc39a230d0296c19707c7961d1004d9cb08cfd084b37df4ac"} Mar 12 12:24:24.992284 master-0 kubenswrapper[7320]: I0312 12:24:24.992090 7320 status_manager.go:851] "Failed to get status for pod" 
podUID="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/pods/machine-approver-754bdc9f9d-2hk7d\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.992577 master-0 kubenswrapper[7320]: I0312 12:24:24.992552 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.992973 master-0 kubenswrapper[7320]: I0312 12:24:24.992948 7320 status_manager.go:851] "Failed to get status for pod" podUID="2bab9dba-235f-467c-9224-634cca9acbd2" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-84bf6db4f9-nq8zw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.993359 master-0 kubenswrapper[7320]: I0312 12:24:24.993327 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:24.993686 master-0 kubenswrapper[7320]: I0312 12:24:24.993664 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" 
event={"ID":"f19c3c89-8d32-4394-bd86-e5ef7734c42b","Type":"ContainerStarted","Data":"c43a87c077b886c9b5abebe477ab0400b4e16ffec4c2791c42b101b7e529c934"} Mar 12 12:24:24.993734 master-0 kubenswrapper[7320]: I0312 12:24:24.993694 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" event={"ID":"f19c3c89-8d32-4394-bd86-e5ef7734c42b","Type":"ContainerStarted","Data":"f82b0c6e15132dfa103b323d69456f6cab9f9cc9f20a58ef69c1f5e37e2daab8"} Mar 12 12:24:24.994445 master-0 kubenswrapper[7320]: I0312 12:24:24.994411 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.001698 master-0 kubenswrapper[7320]: I0312 12:24:25.001663 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"faf97dd3763176aab30d820af53bb9c984317f76cbc30d430d1b3c10226441f2"} Mar 12 12:24:25.001761 master-0 kubenswrapper[7320]: I0312 12:24:25.001707 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be"} Mar 12 12:24:25.002247 master-0 kubenswrapper[7320]: I0312 12:24:25.002189 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.003711 master-0 kubenswrapper[7320]: I0312 12:24:25.003663 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.004932 master-0 kubenswrapper[7320]: I0312 12:24:25.004842 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.005413 master-0 kubenswrapper[7320]: I0312 12:24:25.005217 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.006034 master-0 kubenswrapper[7320]: I0312 12:24:25.006002 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.010557 master-0 kubenswrapper[7320]: I0312 
12:24:25.010085 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.011572 master-0 kubenswrapper[7320]: I0312 12:24:25.011200 7320 status_manager.go:851] "Failed to get status for pod" podUID="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/pods/machine-approver-754bdc9f9d-2hk7d\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.012460 master-0 kubenswrapper[7320]: I0312 12:24:25.012381 7320 status_manager.go:851] "Failed to get status for pod" podUID="f19c3c89-8d32-4394-bd86-e5ef7734c42b" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-664cb58b85-k9t2m\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.012911 master-0 kubenswrapper[7320]: I0312 12:24:25.012878 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.013925 master-0 kubenswrapper[7320]: I0312 12:24:25.013897 7320 status_manager.go:851] "Failed to get status for pod" podUID="2bab9dba-235f-467c-9224-634cca9acbd2" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-84bf6db4f9-nq8zw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.014487 master-0 kubenswrapper[7320]: I0312 12:24:25.014386 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.015807 master-0 kubenswrapper[7320]: I0312 12:24:25.015786 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.017120 master-0 kubenswrapper[7320]: I0312 12:24:25.017091 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.018918 master-0 kubenswrapper[7320]: I0312 12:24:25.018707 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: 
connection refused" Mar 12 12:24:25.021620 master-0 kubenswrapper[7320]: I0312 12:24:25.020595 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.021620 master-0 kubenswrapper[7320]: I0312 12:24:25.021133 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.026319 master-0 kubenswrapper[7320]: I0312 12:24:25.025885 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/0.log" Mar 12 12:24:25.027320 master-0 kubenswrapper[7320]: I0312 12:24:25.026466 7320 generic.go:334] "Generic (PLEG): container finished" podID="632651f7-6641-49d8-9c48-7f6ea5846538" containerID="a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb" exitCode=255 Mar 12 12:24:25.027320 master-0 kubenswrapper[7320]: I0312 12:24:25.026542 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" event={"ID":"632651f7-6641-49d8-9c48-7f6ea5846538","Type":"ContainerDied","Data":"a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb"} Mar 12 12:24:25.027320 master-0 kubenswrapper[7320]: I0312 12:24:25.026932 7320 scope.go:117] "RemoveContainer" containerID="a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb" Mar 12 12:24:25.027830 
master-0 kubenswrapper[7320]: I0312 12:24:25.027665 7320 status_manager.go:851] "Failed to get status for pod" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" pod="openshift-kube-apiserver/installer-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.028099 master-0 kubenswrapper[7320]: I0312 12:24:25.028053 7320 status_manager.go:851] "Failed to get status for pod" podUID="2bab9dba-235f-467c-9224-634cca9acbd2" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/machine-api-operator-84bf6db4f9-nq8zw\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.028699 master-0 kubenswrapper[7320]: I0312 12:24:25.028606 7320 status_manager.go:851] "Failed to get status for pod" podUID="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-machine-approver/pods/machine-approver-754bdc9f9d-2hk7d\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.029719 master-0 kubenswrapper[7320]: I0312 12:24:25.029674 7320 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" event={"ID":"81cb0504-9455-4398-aed1-5cc6790f292e","Type":"ContainerStarted","Data":"ec76313e8f3101ce66ed46fb96c7f37aca18f8240494bdcfc27692fc83f7724a"} Mar 12 12:24:25.029981 master-0 kubenswrapper[7320]: I0312 12:24:25.029849 7320 status_manager.go:851] "Failed to get status for pod" podUID="f19c3c89-8d32-4394-bd86-e5ef7734c42b" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/pods/cluster-samples-operator-664cb58b85-k9t2m\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.032074 master-0 kubenswrapper[7320]: I0312 12:24:25.032025 7320 status_manager.go:851] "Failed to get status for pod" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-insights/pods/insights-operator-8f89dfddd-4vzl8\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.032512 master-0 kubenswrapper[7320]: I0312 12:24:25.032467 7320 status_manager.go:851] "Failed to get status for pod" podUID="632651f7-6641-49d8-9c48-7f6ea5846538" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-autoscaler-operator-69576476f7-ph7gk\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.037162 master-0 kubenswrapper[7320]: I0312 12:24:25.032909 7320 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.037444 master-0 kubenswrapper[7320]: I0312 12:24:25.037252 7320 status_manager.go:851] "Failed to get status for pod" podUID="f3b704e7-1291-4645-8a0d-2a937829d7ac" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-storage-operator/pods/cluster-storage-operator-6fbfc8dc8f-5q4fw\": dial tcp 192.168.32.10:6443: connect: connection 
refused" Mar 12 12:24:25.037894 master-0 kubenswrapper[7320]: I0312 12:24:25.037852 7320 status_manager.go:851] "Failed to get status for pod" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" pod="openshift-kube-controller-manager/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.038416 master-0 kubenswrapper[7320]: I0312 12:24:25.038333 7320 status_manager.go:851] "Failed to get status for pod" podUID="02d5a507-4409-44b4-98bc-1751cdcc6c6a" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-credential-operator/pods/cloud-credential-operator-55d85b7b47-gz7ll\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.039011 master-0 kubenswrapper[7320]: I0312 12:24:25.038890 7320 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.041690 master-0 kubenswrapper[7320]: I0312 12:24:25.039328 7320 status_manager.go:851] "Failed to get status for pod" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/cluster-baremetal-operator-5cdb4c5598-pb97p\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:24:25.071787 master-0 kubenswrapper[7320]: I0312 12:24:25.071746 7320 scope.go:117] "RemoveContainer" containerID="f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03" Mar 12 12:24:25.140168 
master-0 kubenswrapper[7320]: I0312 12:24:25.140025 7320 scope.go:117] "RemoveContainer" containerID="f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f" Mar 12 12:24:25.142310 master-0 kubenswrapper[7320]: E0312 12:24:25.142244 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f\": container with ID starting with f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f not found: ID does not exist" containerID="f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f" Mar 12 12:24:25.142529 master-0 kubenswrapper[7320]: I0312 12:24:25.142318 7320 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f"} err="failed to get container status \"f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f\": rpc error: code = NotFound desc = could not find container \"f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f\": container with ID starting with f5ba430b2c0f7c0c75b3fe2133778e39942ec08ccc1a758dc92871099c861b5f not found: ID does not exist" Mar 12 12:24:25.142529 master-0 kubenswrapper[7320]: I0312 12:24:25.142351 7320 scope.go:117] "RemoveContainer" containerID="e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12" Mar 12 12:24:25.143729 master-0 kubenswrapper[7320]: E0312 12:24:25.143669 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12\": container with ID starting with e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12 not found: ID does not exist" containerID="e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12" Mar 12 12:24:25.143802 master-0 kubenswrapper[7320]: I0312 12:24:25.143752 7320 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12"} err="failed to get container status \"e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12\": rpc error: code = NotFound desc = could not find container \"e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12\": container with ID starting with e3fd3c023a00bc2bced6fd018a80ae919e8800583be4cc7c78ade98bc186ee12 not found: ID does not exist" Mar 12 12:24:25.143802 master-0 kubenswrapper[7320]: I0312 12:24:25.143789 7320 scope.go:117] "RemoveContainer" containerID="f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03" Mar 12 12:24:25.145424 master-0 kubenswrapper[7320]: E0312 12:24:25.145339 7320 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03\": container with ID starting with f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03 not found: ID does not exist" containerID="f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03" Mar 12 12:24:25.145424 master-0 kubenswrapper[7320]: I0312 12:24:25.145377 7320 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03"} err="failed to get container status \"f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03\": rpc error: code = NotFound desc = could not find container \"f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03\": container with ID starting with f695008c85296cee1b0541076cdf8986c14ea55b2a0a92737c8037e5e897fb03 not found: ID does not exist" Mar 12 12:24:25.332028 master-0 kubenswrapper[7320]: E0312 12:24:25.331984 7320 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 12 12:24:25.764274 master-0 kubenswrapper[7320]: I0312 12:24:25.763226 7320 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f77c8e18b751d90bc0dfe2d4e304050" path="/var/lib/kubelet/pods/5f77c8e18b751d90bc0dfe2d4e304050/volumes" Mar 12 12:24:25.764274 master-0 kubenswrapper[7320]: I0312 12:24:25.763653 7320 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 12 12:24:26.046100 master-0 kubenswrapper[7320]: I0312 12:24:26.046030 7320 generic.go:334] "Generic (PLEG): container finished" podID="68c57a64-f30c-4caf-89ef-08bd0d36833e" containerID="66efbdc582547fe0aac5943aa60889d7bd9e3cd56c005e2ffa377552f8953df8" exitCode=0 Mar 12 12:24:26.046547 master-0 kubenswrapper[7320]: I0312 12:24:26.046453 7320 scope.go:117] "RemoveContainer" containerID="66efbdc582547fe0aac5943aa60889d7bd9e3cd56c005e2ffa377552f8953df8" Mar 12 12:24:26.046692 master-0 kubenswrapper[7320]: E0312 12:24:26.046639 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"insights-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=insights-operator pod=insights-operator-8f89dfddd-4vzl8_openshift-insights(68c57a64-f30c-4caf-89ef-08bd0d36833e)\"" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" podUID="68c57a64-f30c-4caf-89ef-08bd0d36833e" Mar 12 12:24:26.050086 master-0 kubenswrapper[7320]: I0312 12:24:26.050000 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/1.log" Mar 12 12:24:26.051048 master-0 kubenswrapper[7320]: I0312 12:24:26.051017 7320 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/0.log" Mar 12 12:24:26.051094 master-0 kubenswrapper[7320]: I0312 12:24:26.051059 7320 generic.go:334] "Generic (PLEG): container finished" podID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" containerID="06f4c15730d5d23bfb91ec1bbc7bab14f9a3a3ae32c22b935d487a1f88576da3" exitCode=1 Mar 12 12:24:26.051567 master-0 kubenswrapper[7320]: I0312 12:24:26.051542 7320 scope.go:117] "RemoveContainer" containerID="06f4c15730d5d23bfb91ec1bbc7bab14f9a3a3ae32c22b935d487a1f88576da3" Mar 12 12:24:26.051769 master-0 kubenswrapper[7320]: E0312 12:24:26.051742 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" Mar 12 12:24:26.072030 master-0 kubenswrapper[7320]: I0312 12:24:26.071994 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/0.log" Mar 12 12:24:26.072409 master-0 kubenswrapper[7320]: I0312 12:24:26.072381 7320 generic.go:334] "Generic (PLEG): container finished" podID="81cb0504-9455-4398-aed1-5cc6790f292e" containerID="db3a8ef068f83e5dd24c99272c185086b8f1f58c8b82fffeba41265fdc76efe5" exitCode=1 Mar 12 12:24:26.072888 master-0 kubenswrapper[7320]: I0312 12:24:26.072869 7320 scope.go:117] "RemoveContainer" containerID="db3a8ef068f83e5dd24c99272c185086b8f1f58c8b82fffeba41265fdc76efe5" Mar 12 12:24:26.075992 master-0 kubenswrapper[7320]: I0312 12:24:26.075938 7320 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/1.log" Mar 12 12:24:26.076660 master-0 kubenswrapper[7320]: I0312 12:24:26.076646 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/0.log" Mar 12 12:24:26.077175 master-0 kubenswrapper[7320]: I0312 12:24:26.077158 7320 generic.go:334] "Generic (PLEG): container finished" podID="632651f7-6641-49d8-9c48-7f6ea5846538" containerID="c119461954016ba56cd4650cbf6e8e2b03da1364b0a52ff3ba4437048b8fac29" exitCode=255 Mar 12 12:24:26.077762 master-0 kubenswrapper[7320]: I0312 12:24:26.077750 7320 scope.go:117] "RemoveContainer" containerID="c119461954016ba56cd4650cbf6e8e2b03da1364b0a52ff3ba4437048b8fac29" Mar 12 12:24:26.078030 master-0 kubenswrapper[7320]: E0312 12:24:26.078015 7320 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-69576476f7-ph7gk_openshift-machine-api(632651f7-6641-49d8-9c48-7f6ea5846538)\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" podUID="632651f7-6641-49d8-9c48-7f6ea5846538" Mar 12 12:24:27.089703 master-0 kubenswrapper[7320]: I0312 12:24:27.089660 7320 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/0.log" Mar 12 12:24:27.716114 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 12 12:24:27.735426 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 12 12:24:27.735722 master-0 systemd[1]: Stopped Kubernetes Kubelet. 
Mar 12 12:24:27.736868 master-0 systemd[1]: kubelet.service: Consumed 23.566s CPU time. Mar 12 12:24:27.750899 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 12 12:24:27.855743 master-0 kubenswrapper[13984]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 12:24:27.855743 master-0 kubenswrapper[13984]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 12 12:24:27.855743 master-0 kubenswrapper[13984]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 12:24:27.855743 master-0 kubenswrapper[13984]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 12 12:24:27.855743 master-0 kubenswrapper[13984]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 12 12:24:27.855743 master-0 kubenswrapper[13984]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 12 12:24:27.856973 master-0 kubenswrapper[13984]: I0312 12:24:27.855838 13984 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 12 12:24:27.859943 master-0 kubenswrapper[13984]: W0312 12:24:27.859904 13984 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 12:24:27.859943 master-0 kubenswrapper[13984]: W0312 12:24:27.859922 13984 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 12:24:27.859943 master-0 kubenswrapper[13984]: W0312 12:24:27.859929 13984 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 12:24:27.859943 master-0 kubenswrapper[13984]: W0312 12:24:27.859934 13984 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 12:24:27.859943 master-0 kubenswrapper[13984]: W0312 12:24:27.859941 13984 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 12 12:24:27.859943 master-0 kubenswrapper[13984]: W0312 12:24:27.859947 13984 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 12 12:24:27.859943 master-0 kubenswrapper[13984]: W0312 12:24:27.859952 13984 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.859959 13984 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.859983 13984 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.859987 13984 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.859990 13984 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.859994 13984 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.859997 13984 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860001 13984 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860005 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860008 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860012 13984 feature_gate.go:330] unrecognized feature gate: Example Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860016 13984 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 12 12:24:27.860358 master-0 
kubenswrapper[13984]: W0312 12:24:27.860021 13984 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860027 13984 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860032 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860037 13984 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860062 13984 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860067 13984 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860070 13984 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860074 13984 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 12:24:27.860358 master-0 kubenswrapper[13984]: W0312 12:24:27.860077 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860081 13984 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860084 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860088 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860091 13984 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 12 
12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860094 13984 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860098 13984 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860102 13984 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860106 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860111 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860115 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860121 13984 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860147 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860152 13984 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860158 13984 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860162 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860166 13984 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860169 13984 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 12:24:27.861735 master-0 
kubenswrapper[13984]: W0312 12:24:27.860172 13984 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860177 13984 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 12 12:24:27.861735 master-0 kubenswrapper[13984]: W0312 12:24:27.860182 13984 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860186 13984 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860191 13984 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860194 13984 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860198 13984 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860222 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860227 13984 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860232 13984 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860236 13984 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860240 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860243 
13984 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860247 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860251 13984 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860254 13984 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860257 13984 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860261 13984 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860264 13984 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860268 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860271 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 12 12:24:27.863037 master-0 kubenswrapper[13984]: W0312 12:24:27.860275 13984 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: W0312 12:24:27.860278 13984 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: W0312 12:24:27.860300 13984 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: W0312 12:24:27.860308 13984 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: W0312 12:24:27.860312 13984 feature_gate.go:330] 
unrecognized feature gate: AdminNetworkPolicy Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: W0312 12:24:27.860316 13984 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: W0312 12:24:27.860319 13984 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860431 13984 flags.go:64] FLAG: --address="0.0.0.0" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860464 13984 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860503 13984 flags.go:64] FLAG: --anonymous-auth="true" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860512 13984 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860520 13984 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860525 13984 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860530 13984 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860535 13984 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860540 13984 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860544 13984 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860549 13984 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860553 13984 flags.go:64] 
FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860558 13984 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860582 13984 flags.go:64] FLAG: --cgroup-root="" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860587 13984 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860592 13984 flags.go:64] FLAG: --client-ca-file="" Mar 12 12:24:27.864514 master-0 kubenswrapper[13984]: I0312 12:24:27.860596 13984 flags.go:64] FLAG: --cloud-config="" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860599 13984 flags.go:64] FLAG: --cloud-provider="" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860604 13984 flags.go:64] FLAG: --cluster-dns="[]" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860610 13984 flags.go:64] FLAG: --cluster-domain="" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860614 13984 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860618 13984 flags.go:64] FLAG: --config-dir="" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860622 13984 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860627 13984 flags.go:64] FLAG: --container-log-max-files="5" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860632 13984 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860636 13984 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860641 13984 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 
12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860664 13984 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860669 13984 flags.go:64] FLAG: --contention-profiling="false" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860676 13984 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860680 13984 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860684 13984 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860688 13984 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860694 13984 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860698 13984 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860703 13984 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860707 13984 flags.go:64] FLAG: --enable-load-reader="false" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860711 13984 flags.go:64] FLAG: --enable-server="true" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860715 13984 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860722 13984 flags.go:64] FLAG: --event-burst="100" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860727 13984 flags.go:64] FLAG: --event-qps="50" Mar 12 12:24:27.866004 master-0 kubenswrapper[13984]: I0312 12:24:27.860731 13984 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 12 12:24:27.867668 master-0 
kubenswrapper[13984]: I0312 12:24:27.860735 13984 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860739 13984 flags.go:64] FLAG: --eviction-hard="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860745 13984 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860749 13984 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860754 13984 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860758 13984 flags.go:64] FLAG: --eviction-soft="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860762 13984 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860766 13984 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860770 13984 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860774 13984 flags.go:64] FLAG: --experimental-mounter-path="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860778 13984 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860782 13984 flags.go:64] FLAG: --fail-swap-on="true" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860787 13984 flags.go:64] FLAG: --feature-gates="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860792 13984 flags.go:64] FLAG: --file-check-frequency="20s" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860797 13984 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 12 12:24:27.867668 master-0 
kubenswrapper[13984]: I0312 12:24:27.860803 13984 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860808 13984 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860813 13984 flags.go:64] FLAG: --healthz-port="10248" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860818 13984 flags.go:64] FLAG: --help="false" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860824 13984 flags.go:64] FLAG: --hostname-override="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860829 13984 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860835 13984 flags.go:64] FLAG: --http-check-frequency="20s" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860840 13984 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860845 13984 flags.go:64] FLAG: --image-credential-provider-config="" Mar 12 12:24:27.867668 master-0 kubenswrapper[13984]: I0312 12:24:27.860850 13984 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860877 13984 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860886 13984 flags.go:64] FLAG: --image-service-endpoint="" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860892 13984 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860897 13984 flags.go:64] FLAG: --kube-api-burst="100" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860903 13984 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 12 12:24:27.868953 master-0 
kubenswrapper[13984]: I0312 12:24:27.860909 13984 flags.go:64] FLAG: --kube-api-qps="50" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860915 13984 flags.go:64] FLAG: --kube-reserved="" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860920 13984 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860926 13984 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860931 13984 flags.go:64] FLAG: --kubelet-cgroups="" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860961 13984 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860967 13984 flags.go:64] FLAG: --lock-file="" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860972 13984 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860976 13984 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860981 13984 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860988 13984 flags.go:64] FLAG: --log-json-split-stream="false" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860993 13984 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.860997 13984 flags.go:64] FLAG: --log-text-split-stream="false" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.861001 13984 flags.go:64] FLAG: --logging-format="text" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.861005 13984 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 
12:24:27.861010 13984 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.861014 13984 flags.go:64] FLAG: --manifest-url="" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.861018 13984 flags.go:64] FLAG: --manifest-url-header="" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.861047 13984 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 12 12:24:27.868953 master-0 kubenswrapper[13984]: I0312 12:24:27.861053 13984 flags.go:64] FLAG: --max-open-files="1000000" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861060 13984 flags.go:64] FLAG: --max-pods="110" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861065 13984 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861071 13984 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861076 13984 flags.go:64] FLAG: --memory-manager-policy="None" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861081 13984 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861087 13984 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861092 13984 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861097 13984 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861136 13984 flags.go:64] FLAG: --node-status-max-images="50" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861142 13984 flags.go:64] FLAG: 
--node-status-update-frequency="10s" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861147 13984 flags.go:64] FLAG: --oom-score-adj="-999" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861152 13984 flags.go:64] FLAG: --pod-cidr="" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861157 13984 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861205 13984 flags.go:64] FLAG: --pod-manifest-path="" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861210 13984 flags.go:64] FLAG: --pod-max-pids="-1" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861214 13984 flags.go:64] FLAG: --pods-per-core="0" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861218 13984 flags.go:64] FLAG: --port="10250" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861222 13984 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861227 13984 flags.go:64] FLAG: --provider-id="" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861231 13984 flags.go:64] FLAG: --qos-reserved="" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861236 13984 flags.go:64] FLAG: --read-only-port="10255" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861240 13984 flags.go:64] FLAG: --register-node="true" Mar 12 12:24:27.870733 master-0 kubenswrapper[13984]: I0312 12:24:27.861244 13984 flags.go:64] FLAG: --register-schedulable="true" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861248 13984 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 
12:24:27.861256 13984 flags.go:64] FLAG: --registry-burst="10" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861278 13984 flags.go:64] FLAG: --registry-qps="5" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861284 13984 flags.go:64] FLAG: --reserved-cpus="" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861288 13984 flags.go:64] FLAG: --reserved-memory="" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861294 13984 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861298 13984 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861302 13984 flags.go:64] FLAG: --rotate-certificates="false" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861307 13984 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861311 13984 flags.go:64] FLAG: --runonce="false" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861315 13984 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861320 13984 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861324 13984 flags.go:64] FLAG: --seccomp-default="false" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861328 13984 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861332 13984 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861337 13984 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861359 13984 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086"
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861364 13984 flags.go:64] FLAG: --storage-driver-password="root"
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861368 13984 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861372 13984 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861376 13984 flags.go:64] FLAG: --storage-driver-user="root"
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861381 13984 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861387 13984 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861392 13984 flags.go:64] FLAG: --system-cgroups=""
Mar 12 12:24:27.872164 master-0 kubenswrapper[13984]: I0312 12:24:27.861396 13984 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861403 13984 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861407 13984 flags.go:64] FLAG: --tls-cert-file=""
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861411 13984 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861417 13984 flags.go:64] FLAG: --tls-min-version=""
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861438 13984 flags.go:64] FLAG: --tls-private-key-file=""
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861443 13984 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861447 13984 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861451 13984 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861456 13984 flags.go:64] FLAG: --v="2"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861463 13984 flags.go:64] FLAG: --version="false"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861468 13984 flags.go:64] FLAG: --vmodule=""
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861506 13984 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: I0312 12:24:27.861511 13984 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861631 13984 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861638 13984 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861660 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861664 13984 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861667 13984 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861673 13984 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861677 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861680 13984 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861684 13984 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 12:24:27.872904 master-0 kubenswrapper[13984]: W0312 12:24:27.861689 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861694 13984 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861698 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861701 13984 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861705 13984 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861709 13984 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861712 13984 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861718 13984 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861739 13984 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861743 13984 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861747 13984 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861750 13984 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861754 13984 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861757 13984 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861761 13984 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861765 13984 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861768 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861772 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861775 13984 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861779 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861783 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 12:24:27.873871 master-0 kubenswrapper[13984]: W0312 12:24:27.861786 13984 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861789 13984 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861793 13984 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861797 13984 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861817 13984 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861821 13984 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861824 13984 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861828 13984 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861831 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861835 13984 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861838 13984 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861843 13984 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861848 13984 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861852 13984 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861857 13984 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861863 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861867 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861870 13984 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861876 13984 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 12:24:27.874910 master-0 kubenswrapper[13984]: W0312 12:24:27.861897 13984 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861903 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861906 13984 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861910 13984 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861914 13984 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861919 13984 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861923 13984 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861927 13984 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861931 13984 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861935 13984 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861939 13984 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861943 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861947 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861952 13984 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861957 13984 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861961 13984 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861965 13984 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861970 13984 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861974 13984 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 12:24:27.875831 master-0 kubenswrapper[13984]: W0312 12:24:27.861980 13984 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.861987 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.861993 13984 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.861998 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: I0312 12:24:27.862006 13984 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: I0312 12:24:27.869151 13984 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: I0312 12:24:27.869181 13984 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869273 13984 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869284 13984 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869289 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869294 13984 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869299 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869303 13984 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869307 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 12:24:27.876957 master-0 kubenswrapper[13984]: W0312 12:24:27.869311 13984 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869316 13984 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869322 13984 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869326 13984 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869330 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869334 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869338 13984 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869343 13984 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869348 13984 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869352 13984 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869356 13984 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869360 13984 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869366 13984 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869371 13984 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869376 13984 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869380 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869384 13984 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869388 13984 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869392 13984 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869397 13984 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 12 12:24:27.877331 master-0 kubenswrapper[13984]: W0312 12:24:27.869402 13984 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869406 13984 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869412 13984 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869417 13984 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869421 13984 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869426 13984 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869430 13984 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869434 13984 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869438 13984 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869442 13984 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869446 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869450 13984 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869455 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869459 13984 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869463 13984 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869467 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869471 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869492 13984 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869497 13984 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869501 13984 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 12:24:27.878715 master-0 kubenswrapper[13984]: W0312 12:24:27.869505 13984 feature_gate.go:330] unrecognized feature gate: Example
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869509 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869513 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869517 13984 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869521 13984 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869525 13984 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869530 13984 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869535 13984 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869539 13984 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869543 13984 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869547 13984 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869551 13984 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869556 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869560 13984 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869564 13984 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869570 13984 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869576 13984 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869584 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869590 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 12:24:27.879499 master-0 kubenswrapper[13984]: W0312 12:24:27.869595 13984 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869599 13984 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869699 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869705 13984 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869710 13984 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869717 13984 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: I0312 12:24:27.869725 13984 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869903 13984 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869913 13984 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869918 13984 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869923 13984 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869927 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869931 13984 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869936 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 12 12:24:27.880159 master-0 kubenswrapper[13984]: W0312 12:24:27.869943 13984 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
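[Editor's note: the deprecation warnings at startup say these flags should move into the file passed via --config. A minimal KubeletConfiguration sketch with the equivalents of the flag values logged above (field names are from the upstream kubelet.config.k8s.io/v1beta1 API; this is illustrative, not the cluster's actual rendered config, and only the upstream gates from the "feature gates:" map are shown — the "unrecognized" gates are OpenShift operator gates the kubelet does not consume):]

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces the deprecated --system-reserved flag.
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# Replaces the deprecated --volume-plugin-dir flag.
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
topologyManagerPolicy: none
topologyManagerScope: container
# Subset of the logged "feature gates:" map (upstream kubelet gates only).
featureGates:
  CloudDualStackNodeIPs: true
  DisableKubeletCloudCredentialProviders: true
  KMSv1: true
  ValidatingAdmissionPolicy: true
```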
Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869948 13984 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869953 13984 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869957 13984 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869962 13984 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869966 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869971 13984 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869975 13984 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869980 13984 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869984 13984 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869989 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869993 13984 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.869998 13984 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.870002 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 12 12:24:27.880690 master-0 
kubenswrapper[13984]: W0312 12:24:27.870006 13984 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.870011 13984 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.870017 13984 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.870023 13984 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.870030 13984 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 12 12:24:27.880690 master-0 kubenswrapper[13984]: W0312 12:24:27.870035 13984 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870039 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870044 13984 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870049 13984 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870054 13984 feature_gate.go:330] unrecognized feature gate: Example Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870058 13984 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870063 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870068 13984 feature_gate.go:330] 
unrecognized feature gate: PersistentIPsForVirtualization Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870116 13984 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870122 13984 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870126 13984 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870131 13984 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870136 13984 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870142 13984 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870146 13984 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870151 13984 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870156 13984 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870161 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870165 13984 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870170 13984 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 12 12:24:27.881805 master-0 kubenswrapper[13984]: W0312 12:24:27.870174 13984 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController 
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870178 13984 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870182 13984 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870187 13984 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870191 13984 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870195 13984 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870199 13984 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870204 13984 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870209 13984 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870213 13984 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870220 13984 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870225 13984 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870230 13984 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870235 13984 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870240 13984 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870245 13984 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870250 13984 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870255 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870259 13984 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 12 12:24:27.882791 master-0 kubenswrapper[13984]: W0312 12:24:27.870297 13984 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: W0312 12:24:27.870302 13984 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: W0312 12:24:27.870306 13984 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: W0312 12:24:27.870311 13984 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: W0312 12:24:27.870315 13984 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: W0312 12:24:27.870319 13984 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: W0312 12:24:27.870324 13984 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: I0312 12:24:27.870331 13984 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: I0312 12:24:27.870582 13984 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: I0312 12:24:27.872619 13984 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: I0312 12:24:27.872717 13984 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: I0312 12:24:27.873340 13984 server.go:997] "Starting client certificate rotation"
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: I0312 12:24:27.873351 13984 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 12 12:24:27.883893 master-0 kubenswrapper[13984]: I0312 12:24:27.873530 13984 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-13 12:11:16 +0000 UTC, rotation deadline is 2026-03-13 05:14:39.914016085 +0000 UTC
Mar 12 12:24:27.884267 master-0 kubenswrapper[13984]: I0312 12:24:27.873606 13984 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 16h50m12.040412639s for next certificate rotation
Mar 12 12:24:27.884267 master-0 kubenswrapper[13984]: I0312 12:24:27.873906 13984 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 12:24:27.884267 master-0 kubenswrapper[13984]: I0312 12:24:27.875494 13984 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 12 12:24:27.884267 master-0 kubenswrapper[13984]: I0312 12:24:27.877795 13984 log.go:25] "Validated CRI v1 runtime API"
Mar 12 12:24:27.884267 master-0 kubenswrapper[13984]: I0312 12:24:27.881084 13984 log.go:25] "Validated CRI v1 image API"
Mar 12 12:24:27.884267 master-0 kubenswrapper[13984]: I0312 12:24:27.881913 13984 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 12 12:24:27.890047 master-0 kubenswrapper[13984]: I0312 12:24:27.889998 13984 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4 f1cf7764-854b-4c2c-9df4-b92427278cd1:/dev/vda3]
Mar 12 12:24:27.890902 master-0 kubenswrapper[13984]: I0312 12:24:27.890033 13984 fs.go:136] Filesystem partitions: 
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1588984bdf65a943d367a0cf55fa9d9d29fbe3f46cc19ebf7ccc83dd17e2e7b7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1588984bdf65a943d367a0cf55fa9d9d29fbe3f46cc19ebf7ccc83dd17e2e7b7/userdata/shm major:0 minor:513 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a/userdata/shm major:0 minor:134 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1faac8f245695a0b4273e53303a303c00cf33c7f391f25e9c09fc9c6b457b1b5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1faac8f245695a0b4273e53303a303c00cf33c7f391f25e9c09fc9c6b457b1b5/userdata/shm major:0 minor:304 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2860b265a556fe93cc79d001d83d971ba4d1223844dca9c9d4f423b151e14d7f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2860b265a556fe93cc79d001d83d971ba4d1223844dca9c9d4f423b151e14d7f/userdata/shm major:0 minor:267 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2ada9e185ebcd7796052d553750f50a0d8e59b4d4c070029c61ef23cd75f5a22/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2ada9e185ebcd7796052d553750f50a0d8e59b4d4c070029c61ef23cd75f5a22/userdata/shm major:0 minor:808 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/2add9d2e5cdf8eafbdab4b7a44808b2e6f41988b57e334321268362b6270eb86/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2add9d2e5cdf8eafbdab4b7a44808b2e6f41988b57e334321268362b6270eb86/userdata/shm major:0 minor:663 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba/userdata/shm major:0 minor:257 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3b72e72a154930e6638978b9d96d1fb9f48ad70245754bc46fa997ab9b768457/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3b72e72a154930e6638978b9d96d1fb9f48ad70245754bc46fa997ab9b768457/userdata/shm major:0 minor:473 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4300be3c5fe59df72fe35edd262229cf307037ab319e70ec8058015a01d299e1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4300be3c5fe59df72fe35edd262229cf307037ab319e70ec8058015a01d299e1/userdata/shm major:0 minor:273 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/43388f3fc4030bd546e0a3b05c45ff8414138bbab582c7b9e9531efe462ae9bb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/43388f3fc4030bd546e0a3b05c45ff8414138bbab582c7b9e9531efe462ae9bb/userdata/shm major:0 minor:309 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/437f19dee9aaecaf0da0f9a4e2b870d07ad9f109bec28fd8d9b704649289798a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/437f19dee9aaecaf0da0f9a4e2b870d07ad9f109bec28fd8d9b704649289798a/userdata/shm major:0 minor:802 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/4668278e293f25c682097a5e7d9cfb181571373cacd2a15b29dc6c82fc88e8f4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4668278e293f25c682097a5e7d9cfb181571373cacd2a15b29dc6c82fc88e8f4/userdata/shm major:0 minor:805 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/48802dfcb8b0bd5f567eee89b3f4b5a9769fbf82dea70b650fc097ec0ea21366/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/48802dfcb8b0bd5f567eee89b3f4b5a9769fbf82dea70b650fc097ec0ea21366/userdata/shm major:0 minor:800 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4e68f4bc560c5df24d0c2f98d45fcf56d54f2e3305701c6372ad15ea940098c4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4e68f4bc560c5df24d0c2f98d45fcf56d54f2e3305701c6372ad15ea940098c4/userdata/shm major:0 minor:489 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8/userdata/shm major:0 minor:632 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be/userdata/shm major:0 minor:52 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/57810d434673756bb33355876f6921aff74c6f281e31a4bda0e7128f4df78dd1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/57810d434673756bb33355876f6921aff74c6f281e31a4bda0e7128f4df78dd1/userdata/shm major:0 minor:484 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81/userdata/shm major:0 minor:103 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df/userdata/shm major:0 minor:61 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/64f1dc2d81d717bf0671e44d8f5d716029eb32e5b7d761c1919a015da1e533b6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/64f1dc2d81d717bf0671e44d8f5d716029eb32e5b7d761c1919a015da1e533b6/userdata/shm major:0 minor:371 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6519636251c6c38ef9d066c61d6299777a9c7b8af3d694989727eb85f1e60cdc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6519636251c6c38ef9d066c61d6299777a9c7b8af3d694989727eb85f1e60cdc/userdata/shm major:0 minor:274 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/700144f71aaf4edd85bba4cceda2dd6a1013711fa671c636535cf91528c721c3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/700144f71aaf4edd85bba4cceda2dd6a1013711fa671c636535cf91528c721c3/userdata/shm major:0 minor:487 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/70d56295fb2db27f91085477bc0d85a5a75eec054bb4637a7df1876812db41e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/70d56295fb2db27f91085477bc0d85a5a75eec054bb4637a7df1876812db41e7/userdata/shm major:0 minor:674 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9/userdata/shm major:0 minor:468 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7e3651e4363549a04b82ec26394498f6d4ec7456d1ae54da099d3f3dc779acb3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7e3651e4363549a04b82ec26394498f6d4ec7456d1ae54da099d3f3dc779acb3/userdata/shm major:0 minor:665 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/88bba27aba99f4035ab7b0c7f4bc171c7586160f39546ce2e2e9d71e92be1a68/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/88bba27aba99f4035ab7b0c7f4bc171c7586160f39546ce2e2e9d71e92be1a68/userdata/shm major:0 minor:508 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a760fef0730108276c12f0e7e65889ee4d7455d96d4c3e35ad89035b139d417/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a760fef0730108276c12f0e7e65889ee4d7455d96d4c3e35ad89035b139d417/userdata/shm major:0 minor:263 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6/userdata/shm major:0 minor:84 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9093184a94434d195f504d748e233081459211af66933b52ecc618c767700a31/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9093184a94434d195f504d748e233081459211af66933b52ecc618c767700a31/userdata/shm major:0 minor:393 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/91dfd404945dfa72b72f5ee55329cfd84189e41ee3e0a4c889c8d9f86c69e940/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/91dfd404945dfa72b72f5ee55329cfd84189e41ee3e0a4c889c8d9f86c69e940/userdata/shm major:0 minor:666 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9576089e2647ba9a9f5df3c572d8e7de7b7129020d7e31900e7d5c8dd8366e64/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9576089e2647ba9a9f5df3c572d8e7de7b7129020d7e31900e7d5c8dd8366e64/userdata/shm major:0 minor:254 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9ed5d4e2f5f9d30b72c838a61c437059eadf2c2990c7efa7688884bd954ef475/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9ed5d4e2f5f9d30b72c838a61c437059eadf2c2990c7efa7688884bd954ef475/userdata/shm major:0 minor:392 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a31234df882122d07e602b31ef9412c0c41c548e3f6b29cd809ca5a3f68cae28/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a31234df882122d07e602b31ef9412c0c41c548e3f6b29cd809ca5a3f68cae28/userdata/shm major:0 minor:483 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a73570b27cdf14004fcfc5c69eadd2063cb040486e8ff1a903a8a1c41a806cb5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a73570b27cdf14004fcfc5c69eadd2063cb040486e8ff1a903a8a1c41a806cb5/userdata/shm major:0 minor:784 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a7cef5c725422fbd2eaaa42e7229e1a33d2367f18d70fb9f0b28995a04b77f89/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a7cef5c725422fbd2eaaa42e7229e1a33d2367f18d70fb9f0b28995a04b77f89/userdata/shm major:0 minor:670 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a9dc508c688ae34bc2eddc6709fa675fb20e54f9d7eb42f1172affaecbda59cc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a9dc508c688ae34bc2eddc6709fa675fb20e54f9d7eb42f1172affaecbda59cc/userdata/shm major:0 minor:798 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79/userdata/shm major:0 minor:114 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/ad38671dc0e01021059e694690e391eb7e5f662e10f37c5a4ec0d76fc36b9929/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ad38671dc0e01021059e694690e391eb7e5f662e10f37c5a4ec0d76fc36b9929/userdata/shm major:0 minor:803 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b320b1f710b0d644ecef2c2ad1bae1650b0f603989d0eba39702b2f48e918747/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b320b1f710b0d644ecef2c2ad1bae1650b0f603989d0eba39702b2f48e918747/userdata/shm major:0 minor:603 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b5ba5b90d6ae03cfb78b058e1659ce46a060105bfebabaafdc41fb8977ded8b7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b5ba5b90d6ae03cfb78b058e1659ce46a060105bfebabaafdc41fb8977ded8b7/userdata/shm major:0 minor:521 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b8ffdf31e610994af2b57b9f88a7795bae0cba26fb04c18d8c2445e3f0680a53/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b8ffdf31e610994af2b57b9f88a7795bae0cba26fb04c18d8c2445e3f0680a53/userdata/shm major:0 minor:423 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bfec511e21bbf0dfd869b19d5fa48c209028875c1e502c22cc890b2268398c69/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bfec511e21bbf0dfd869b19d5fa48c209028875c1e502c22cc890b2268398c69/userdata/shm major:0 minor:662 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c999be46e4c7dd19c23c0225b79825f8ad177bab53b4d5cdac62201c5aa7f539/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c999be46e4c7dd19c23c0225b79825f8ad177bab53b4d5cdac62201c5aa7f539/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cd26eb3215c3045fef95e54a836428d267922b388ae43b971d77bc3784523536/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cd26eb3215c3045fef95e54a836428d267922b388ae43b971d77bc3784523536/userdata/shm major:0 minor:791 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e/userdata/shm major:0 minor:126 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d7aae39ac0f8cea9552f3bc1f796a0a6a68481f6079d7651593a9da0b4c18f5a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d7aae39ac0f8cea9552f3bc1f796a0a6a68481f6079d7651593a9da0b4c18f5a/userdata/shm major:0 minor:672 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8d2d7e73e7439ba1eb1229f42a2e5a6be0805b87e3874b733e04979b526355b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8d2d7e73e7439ba1eb1229f42a2e5a6be0805b87e3874b733e04979b526355b/userdata/shm major:0 minor:790 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587/userdata/shm major:0 minor:259 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32/userdata/shm major:0 minor:466 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ded4724d743688b7920664606d22270128172d27c82b29e81d8b50ac01a66fa9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ded4724d743688b7920664606d22270128172d27c82b29e81d8b50ac01a66fa9/userdata/shm major:0 minor:439 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c/userdata/shm major:0 minor:245 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f6b1422fb985a54d196c8cc057ae0368a92f6fe87e61b3d31c43cf22a41a1666/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f6b1422fb985a54d196c8cc057ae0368a92f6fe87e61b3d31c43cf22a41a1666/userdata/shm major:0 minor:792 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/fd48f85cdff86dca5fd974e33c046e44989ec396616456ae028b5495072f5b8b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fd48f85cdff86dca5fd974e33c046e44989ec396616456ae028b5495072f5b8b/userdata/shm major:0 minor:143 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/021b22e3-b4c5-426d-b761-181f1e54175d/volumes/kubernetes.io~projected/kube-api-access-xggp6:{mountpoint:/var/lib/kubelet/pods/021b22e3-b4c5-426d-b761-181f1e54175d/volumes/kubernetes.io~projected/kube-api-access-xggp6 major:0 minor:462 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/021b22e3-b4c5-426d-b761-181f1e54175d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/021b22e3-b4c5-426d-b761-181f1e54175d/volumes/kubernetes.io~secret/serving-cert major:0 minor:383 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~projected/ca-certs major:0 minor:421 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~projected/kube-api-access-pxrnf:{mountpoint:/var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~projected/kube-api-access-pxrnf major:0 minor:422 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:435 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/02d5a507-4409-44b4-98bc-1751cdcc6c6a/volumes/kubernetes.io~projected/kube-api-access-jd4jz:{mountpoint:/var/lib/kubelet/pods/02d5a507-4409-44b4-98bc-1751cdcc6c6a/volumes/kubernetes.io~projected/kube-api-access-jd4jz major:0 minor:772 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/02d5a507-4409-44b4-98bc-1751cdcc6c6a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/02d5a507-4409-44b4-98bc-1751cdcc6c6a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:763 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~projected/kube-api-access-m9gmt:{mountpoint:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~projected/kube-api-access-m9gmt major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~secret/serving-cert major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/10498208-0692-4533-b672-a7a2cfcdf1be/volumes/kubernetes.io~projected/kube-api-access-xdwfl:{mountpoint:/var/lib/kubelet/pods/10498208-0692-4533-b672-a7a2cfcdf1be/volumes/kubernetes.io~projected/kube-api-access-xdwfl major:0 minor:118 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/114b1d16-b37d-449c-84e3-3fb3f8b20eaa/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/114b1d16-b37d-449c-84e3-3fb3f8b20eaa/volumes/kubernetes.io~projected/kube-api-access major:0 minor:471 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/114b1d16-b37d-449c-84e3-3fb3f8b20eaa/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/114b1d16-b37d-449c-84e3-3fb3f8b20eaa/volumes/kubernetes.io~secret/serving-cert major:0 minor:470 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/15bf86d9-62b3-4af8-b6f6-23131d712332/volumes/kubernetes.io~projected/kube-api-access-g7tct:{mountpoint:/var/lib/kubelet/pods/15bf86d9-62b3-4af8-b6f6-23131d712332/volumes/kubernetes.io~projected/kube-api-access-g7tct major:0 minor:391 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/15bf86d9-62b3-4af8-b6f6-23131d712332/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/15bf86d9-62b3-4af8-b6f6-23131d712332/volumes/kubernetes.io~secret/signing-key major:0 minor:386 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/269d77d9-815e-4324-8827-1ce429063ed1/volumes/kubernetes.io~projected/kube-api-access-jxt29:{mountpoint:/var/lib/kubelet/pods/269d77d9-815e-4324-8827-1ce429063ed1/volumes/kubernetes.io~projected/kube-api-access-jxt29 major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2bab9dba-235f-467c-9224-634cca9acbd2/volumes/kubernetes.io~projected/kube-api-access-cv9kn:{mountpoint:/var/lib/kubelet/pods/2bab9dba-235f-467c-9224-634cca9acbd2/volumes/kubernetes.io~projected/kube-api-access-cv9kn major:0 minor:780 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2bab9dba-235f-467c-9224-634cca9acbd2/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/2bab9dba-235f-467c-9224-634cca9acbd2/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:776 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~projected/kube-api-access-b2f6r:{mountpoint:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~projected/kube-api-access-b2f6r major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/36852fda-6aee-4a36-8724-537f1260c4c8/volumes/kubernetes.io~projected/kube-api-access-54547:{mountpoint:/var/lib/kubelet/pods/36852fda-6aee-4a36-8724-537f1260c4c8/volumes/kubernetes.io~projected/kube-api-access-54547 major:0 minor:510 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~projected/kube-api-access-p97xk:{mountpoint:/var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~projected/kube-api-access-p97xk major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:658 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~projected/kube-api-access-xx2c4:{mountpoint:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~projected/kube-api-access-xx2c4 major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~projected/kube-api-access-wg27g:{mountpoint:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~projected/kube-api-access-wg27g major:0 minor:142 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~secret/webhook-cert major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/54612733-158f-4a92-a1bf-f4a8d653ffaf/volumes/kubernetes.io~projected/kube-api-access-9lnbq:{mountpoint:/var/lib/kubelet/pods/54612733-158f-4a92-a1bf-f4a8d653ffaf/volumes/kubernetes.io~projected/kube-api-access-9lnbq major:0 minor:240 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~projected/kube-api-access-svpvs:{mountpoint:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~projected/kube-api-access-svpvs major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/580bafd6-af8c-4961-b959-b736a180e309/volumes/kubernetes.io~projected/kube-api-access-lf74f:{mountpoint:/var/lib/kubelet/pods/580bafd6-af8c-4961-b959-b736a180e309/volumes/kubernetes.io~projected/kube-api-access-lf74f major:0 minor:338 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5a012d0b-d1a8-4cd3-8b91-b346d0445f24/volumes/kubernetes.io~projected/kube-api-access-9bx48:{mountpoint:/var/lib/kubelet/pods/5a012d0b-d1a8-4cd3-8b91-b346d0445f24/volumes/kubernetes.io~projected/kube-api-access-9bx48 major:0 minor:255 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~projected/kube-api-access-kv9fk:{mountpoint:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~projected/kube-api-access-kv9fk major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~secret/metrics-tls major:0 minor:101 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/volumes/kubernetes.io~projected/kube-api-access-tb249:{mountpoint:/var/lib/kubelet/pods/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/volumes/kubernetes.io~projected/kube-api-access-tb249 major:0 minor:771 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:751 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/632651f7-6641-49d8-9c48-7f6ea5846538/volumes/kubernetes.io~projected/kube-api-access-86k82:{mountpoint:/var/lib/kubelet/pods/632651f7-6641-49d8-9c48-7f6ea5846538/volumes/kubernetes.io~projected/kube-api-access-86k82 major:0 minor:782 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/632651f7-6641-49d8-9c48-7f6ea5846538/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/632651f7-6641-49d8-9c48-7f6ea5846538/volumes/kubernetes.io~secret/cert major:0 minor:777 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~projected/kube-api-access major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/666857a1-0ddf-4b48-91f4-44cce154d1b1/volumes/kubernetes.io~projected/kube-api-access-vrqx7:{mountpoint:/var/lib/kubelet/pods/666857a1-0ddf-4b48-91f4-44cce154d1b1/volumes/kubernetes.io~projected/kube-api-access-vrqx7 major:0 minor:92 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/68c57a64-f30c-4caf-89ef-08bd0d36833e/volumes/kubernetes.io~projected/kube-api-access-x4nbb:{mountpoint:/var/lib/kubelet/pods/68c57a64-f30c-4caf-89ef-08bd0d36833e/volumes/kubernetes.io~projected/kube-api-access-x4nbb major:0 minor:783 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/68c57a64-f30c-4caf-89ef-08bd0d36833e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/68c57a64-f30c-4caf-89ef-08bd0d36833e/volumes/kubernetes.io~secret/serving-cert major:0 minor:773 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~projected/kube-api-access-dcb5s:{mountpoint:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~projected/kube-api-access-dcb5s major:0 minor:602 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/encryption-config major:0 minor:600 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/etcd-client major:0 minor:601 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/serving-cert major:0 minor:596 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~projected/kube-api-access-jcltq:{mountpoint:/var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~projected/kube-api-access-jcltq major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~secret/webhook-certs major:0 minor:660 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/volumes/kubernetes.io~projected/kube-api-access-wvrpf:{mountpoint:/var/lib/kubelet/pods/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/volumes/kubernetes.io~projected/kube-api-access-wvrpf major:0 minor:370 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/81cb0504-9455-4398-aed1-5cc6790f292e/volumes/kubernetes.io~projected/kube-api-access-j5jgq:{mountpoint:/var/lib/kubelet/pods/81cb0504-9455-4398-aed1-5cc6790f292e/volumes/kubernetes.io~projected/kube-api-access-j5jgq major:0 minor:754 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/81cb0504-9455-4398-aed1-5cc6790f292e/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/81cb0504-9455-4398-aed1-5cc6790f292e/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:752 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~projected/kube-api-access major:0 minor:253 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~secret/serving-cert major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/99a11fe6-48a1-439e-b788-158dbe267dcd/volumes/kubernetes.io~projected/kube-api-access-ns72p:{mountpoint:/var/lib/kubelet/pods/99a11fe6-48a1-439e-b788-158dbe267dcd/volumes/kubernetes.io~projected/kube-api-access-ns72p major:0 minor:781 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/99a11fe6-48a1-439e-b788-158dbe267dcd/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/99a11fe6-48a1-439e-b788-158dbe267dcd/volumes/kubernetes.io~secret/proxy-tls major:0 minor:778 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~projected/kube-api-access major:0 minor:252 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~secret/serving-cert major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~projected/kube-api-access-xmlzw:{mountpoint:/var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~projected/kube-api-access-xmlzw major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~secret/srv-cert major:0 minor:659 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9d47f860-d64a-49b8-b404-a67cbc2faeb6/volumes/kubernetes.io~projected/kube-api-access-njx9l:{mountpoint:/var/lib/kubelet/pods/9d47f860-d64a-49b8-b404-a67cbc2faeb6/volumes/kubernetes.io~projected/kube-api-access-njx9l major:0 minor:507 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9d47f860-d64a-49b8-b404-a67cbc2faeb6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/9d47f860-d64a-49b8-b404-a67cbc2faeb6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:506 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~projected/kube-api-access-x8mvz:{mountpoint:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~projected/kube-api-access-x8mvz major:0 minor:227 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/kube-api-access-ft5sd:{mountpoint:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/kube-api-access-ft5sd major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~secret/metrics-tls major:0 minor:477 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~projected/kube-api-access-9dhfq:{mountpoint:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~projected/kube-api-access-9dhfq major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~secret/serving-cert major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/aa8ddfdd-7f2d-4fd4-b666-1497dee752df/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/aa8ddfdd-7f2d-4fd4-b666-1497dee752df/volumes/kubernetes.io~projected/ca-certs major:0 minor:415 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/aa8ddfdd-7f2d-4fd4-b666-1497dee752df/volumes/kubernetes.io~projected/kube-api-access-6wj8x:{mountpoint:/var/lib/kubelet/pods/aa8ddfdd-7f2d-4fd4-b666-1497dee752df/volumes/kubernetes.io~projected/kube-api-access-6wj8x major:0 minor:420 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~projected/kube-api-access-pwfct:{mountpoint:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~projected/kube-api-access-pwfct major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~projected/kube-api-access-6l57v:{mountpoint:/var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~projected/kube-api-access-6l57v major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:657 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~projected/kube-api-access-fqcrz:{mountpoint:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~projected/kube-api-access-fqcrz major:0 minor:244 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~secret/serving-cert major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~projected/kube-api-access-s4r7v:{mountpoint:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~projected/kube-api-access-s4r7v major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:478 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:480 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c62edaec-38e2-4b73-8bb5-c776abfb310f/volumes/kubernetes.io~projected/kube-api-access-cdfnh:{mountpoint:/var/lib/kubelet/pods/c62edaec-38e2-4b73-8bb5-c776abfb310f/volumes/kubernetes.io~projected/kube-api-access-cdfnh major:0 minor:710 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c62edaec-38e2-4b73-8bb5-c776abfb310f/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/c62edaec-38e2-4b73-8bb5-c776abfb310f/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:709 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~projected/kube-api-access-94src:{mountpoint:/var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~projected/kube-api-access-94src major:0 minor:779 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~secret/cert major:0 minor:775 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:774 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~projected/kube-api-access-kgxbv:{mountpoint:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~projected/kube-api-access-kgxbv major:0 minor:515 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/encryption-config major:0 minor:504 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/etcd-client major:0 minor:505 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/serving-cert major:0 minor:514 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/volumes/kubernetes.io~projected/kube-api-access-btbt7:{mountpoint:/var/lib/kubelet/pods/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/volumes/kubernetes.io~projected/kube-api-access-btbt7 major:0 minor:402 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/volumes/kubernetes.io~secret/serving-cert major:0 minor:107 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/kube-api-access-r5xlx:{mountpoint:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/kube-api-access-r5xlx major:0 minor:248 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:481 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~projected/kube-api-access-jq9d5:{mountpoint:/var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~projected/kube-api-access-jq9d5 major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:661 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~projected/kube-api-access-zdvf6:{mountpoint:/var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~projected/kube-api-access-zdvf6 major:0 minor:221 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~secret/srv-cert major:0 minor:648 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~projected/kube-api-access-tq5c7:{mountpoint:/var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~projected/kube-api-access-tq5c7 major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~secret/metrics-certs major:0 minor:647 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~projected/kube-api-access-xbztv:{mountpoint:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~projected/kube-api-access-xbztv major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:570 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~empty-dir/tmp major:0 minor:579 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~projected/kube-api-access-qhwbb:{mountpoint:/var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~projected/kube-api-access-qhwbb major:0 minor:584 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~projected/kube-api-access-gd645:{mountpoint:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~projected/kube-api-access-gd645 major:0 minor:133 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:132 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f19c3c89-8d32-4394-bd86-e5ef7734c42b/volumes/kubernetes.io~projected/kube-api-access-s28gq:{mountpoint:/var/lib/kubelet/pods/f19c3c89-8d32-4394-bd86-e5ef7734c42b/volumes/kubernetes.io~projected/kube-api-access-s28gq major:0 minor:770 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f19c3c89-8d32-4394-bd86-e5ef7734c42b/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/f19c3c89-8d32-4394-bd86-e5ef7734c42b/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:551 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3b704e7-1291-4645-8a0d-2a937829d7ac/volumes/kubernetes.io~projected/kube-api-access-zct7x:{mountpoint:/var/lib/kubelet/pods/f3b704e7-1291-4645-8a0d-2a937829d7ac/volumes/kubernetes.io~projected/kube-api-access-zct7x major:0 minor:753 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f3b704e7-1291-4645-8a0d-2a937829d7ac/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f3b704e7-1291-4645-8a0d-2a937829d7ac/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:731 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~projected/kube-api-access-6flvz:{mountpoint:/var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~projected/kube-api-access-6flvz major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~secret/metrics-tls major:0 minor:479 fsType:tmpfs blockSize:0} overlay_0-105:{mountpoint:/var/lib/containers/storage/overlay/6918e18f41709fa6650e2480e0000edf9b35beb3f771ce3fb624c8d3824ab074/merged major:0 minor:105 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/7327adbcb3e0694907e9a2311dbe59e4abb9235f2db5ca6622d68842ecc9f511/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/1a23ca58d99cfb96c0a7416d1fcf8c85e5759214533a5f47defdfac9c30c53f3/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/9f44eb1099f6e7ca6b2919c25fe6b16f301625ab35847f6535e8a0bed60948b6/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-128:{mountpoint:/var/lib/containers/storage/overlay/0227392e62943a5421d5117a45613bb529ec58a6b4e1020c6c4724f64bfeb49b/merged major:0 minor:128 fsType:overlay blockSize:0} overlay_0-130:{mountpoint:/var/lib/containers/storage/overlay/de56fda083a3a46b5d9dfe72d06ed1a6f006d2faea2846b62646a3c68dc246fe/merged major:0 minor:130 fsType:overlay blockSize:0} 
overlay_0-139:{mountpoint:/var/lib/containers/storage/overlay/1de229b94ac4c87df95373ce1d4b224ba337afd7df3c6e2863683e588756d225/merged major:0 minor:139 fsType:overlay blockSize:0} overlay_0-145:{mountpoint:/var/lib/containers/storage/overlay/0e5100b29e5eb59624ed266a476b13b49716cd035088507dca49a2efb851c004/merged major:0 minor:145 fsType:overlay blockSize:0} overlay_0-147:{mountpoint:/var/lib/containers/storage/overlay/123db14c086e03a6fa2a93cd350da3c7d3f6bad3562d41fb9a648bd89953d36e/merged major:0 minor:147 fsType:overlay blockSize:0} overlay_0-149:{mountpoint:/var/lib/containers/storage/overlay/5671824a14c759f74bac5e3e4d4eda71da1e428a35c6a1d759cd3f101ac39dd4/merged major:0 minor:149 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/f99c96a5d52735f372e2f64dc3e047205afbea3602c688881c03774572b374ad/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/e77186afdf459bdc81ad0fbf9f8d84fb80abcdefef820e33c453f65ab9511bfd/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/732ffaaaeed58a2af6625eb1cba1ad5af20d11ed9e1cb2c36fdf3dacc399b752/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-174:{mountpoint:/var/lib/containers/storage/overlay/f01dadb2943df2cdd191d4eb77d10fc8d66487e339539769133e56d239134687/merged major:0 minor:174 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/14455f71ceb3566330a851bdd9c0beba40fd46e8b1a3114d513bef5cd9c7697d/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/8147fd49d7060556f1a6d7ed71b7be78090cee700273c0a3331008c154f3317a/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/e6c234ffa99a8fbfee78f7c28394d10b4f69637de07ad9450a1130477f7178b0/merged major:0 minor:189 fsType:overlay blockSize:0} 
overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/b93afabc703f079e793f8c205c4d7498282b6adc4b6926f0310613f4335fa7d9/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/8da7d23860429d271aadedb88a691fe8c0acd58cd6632612c95032b9cf8d7582/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/2d7aab61a22bec48a13bbbf12c4375b33a2b6cb4ce4ba67fe69048ec0ff7e039/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-250:{mountpoint:/var/lib/containers/storage/overlay/d1934d6434bb66e17ad49f3f3037da293f0464476f24037ea944c55e783f9aa6/merged major:0 minor:250 fsType:overlay blockSize:0} overlay_0-269:{mountpoint:/var/lib/containers/storage/overlay/7c69419df66553e83b8bc70bd47f65c61c4b96e95005d7d20c25d1dc33031029/merged major:0 minor:269 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/019d4165db29f7ead30fd39128bf626db5ad5165f5130442249ebe567fc975bf/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/eb487e733038a49d042ad434ccc6ecf82eb467eb91b2eab81747168b2bb31db6/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/1ed9e727adc96ddf7693cc5d755fd8bf0f8dd50d4e80c21b660558df9055ee06/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/5877efa5f57baf9890204c4bb74d7c4023b643b83f62ef3151c90d81295ebb42/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/dd0828333b7e61e4f647add8c3cfdbf06d9b31077977717f0d765cb044fc1cf1/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/72045d9d16c8d4e408a28ab3fd7f16afcbe80618d9761ac94ad62e22f2aeaf78/merged major:0 minor:291 fsType:overlay blockSize:0} 
overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/40f5fc8b9d4d34715d9ba18873dccbc16b47a4f4fe4218c9aeca37d5187c269f/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/43a25658cd2c258fda0386101fe97332398cc34add2d7f4d6b4b39c126524a10/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/cb7a9227e97c2d237f018ec73f0a458c58aa44cb71edf741e0cbd0e71409cd14/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/16ae3329f6bf42aac4b1fba0b90ace71b79e779f37b5a17ad7706f45a12bce05/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/fc953aaad4925eaf9eb3ae2e191f68666d9a369f6663b747ad38ccd49f1f0704/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/07980bb4a8edc5c6a56e02370ac7957e26e5ed8153dffc34a300ce7830582c08/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/f7199d502447f14b64bf11ecb2b42cd7c792c4095ac1bb941d000fc3ab8046dd/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/49d5468cf24d1cd7f03facb60b122c69da2d367a764a7e3f8fac8780a188ed76/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-315:{mountpoint:/var/lib/containers/storage/overlay/4040521e5b0c6062ae89fd58ce5efbacbde5dca6318cf2c7b124f3e4ec04f172/merged major:0 minor:315 fsType:overlay blockSize:0} overlay_0-316:{mountpoint:/var/lib/containers/storage/overlay/2aa368cd2316f83a261a7623567f962e7856d939e0ea820e84780affe9fff192/merged major:0 minor:316 fsType:overlay blockSize:0} overlay_0-317:{mountpoint:/var/lib/containers/storage/overlay/b39a0e58c3c178995b2447788945536343a3a00eef2107397c2efd8fbe3cbe37/merged major:0 minor:317 fsType:overlay blockSize:0} 
overlay_0-322:{mountpoint:/var/lib/containers/storage/overlay/c43891ff6537cd3e36696e9fcfa4f03bc6f324a655fed71cc1b29d3bb77036ef/merged major:0 minor:322 fsType:overlay blockSize:0} overlay_0-324:{mountpoint:/var/lib/containers/storage/overlay/0194f21c02ecfed006b9c3b6f32f8b8c4cd54ce31824e07b57ba1b1079cf7352/merged major:0 minor:324 fsType:overlay blockSize:0} overlay_0-326:{mountpoint:/var/lib/containers/storage/overlay/e824f0180ba483e223d0a8eece3c384118c5aa9dd591e44aa57e6de35199ac9a/merged major:0 minor:326 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/ce056ad6348d2c8d6a15cc4aa2c73d8fe93972c5bb249e1eacb74c0fd974776e/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-330:{mountpoint:/var/lib/containers/storage/overlay/6976623c2ccce15328834d20b9958fb01d50e2aeaf6c9ae9b78c6966e9164d0e/merged major:0 minor:330 fsType:overlay blockSize:0} overlay_0-332:{mountpoint:/var/lib/containers/storage/overlay/fc3ce1a14c4850cfd4426416e9c9a189f3d8441aab7a8a73208c9a303382ab0d/merged major:0 minor:332 fsType:overlay blockSize:0} overlay_0-334:{mountpoint:/var/lib/containers/storage/overlay/3dc17d88daa8b867afe689e68be022bd438f7eea8a691e687cde1929214545a3/merged major:0 minor:334 fsType:overlay blockSize:0} overlay_0-340:{mountpoint:/var/lib/containers/storage/overlay/6fbd2744b6f3931c9a4970b555c4cd17c678e8338a25f921b9c873b474ece728/merged major:0 minor:340 fsType:overlay blockSize:0} overlay_0-350:{mountpoint:/var/lib/containers/storage/overlay/6df83868cec55941817ac188980a76a8fc6543b3e5f900c08ee62a16a06dc3d8/merged major:0 minor:350 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/adb120634641b6c7e1c1021cddf46ccd4c7ad7dd357163134e9262c1c2bbd2f5/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-355:{mountpoint:/var/lib/containers/storage/overlay/97bc42aacba726001289f1a4920eb2f3c3ffef54cd1e07c808fc60274e1ff040/merged major:0 minor:355 fsType:overlay blockSize:0} 
overlay_0-356:{mountpoint:/var/lib/containers/storage/overlay/9700ee87cc9f40503560c6a20a0fd9e37d4b9282a07e3944b4c7aacc46d34321/merged major:0 minor:356 fsType:overlay blockSize:0} overlay_0-361:{mountpoint:/var/lib/containers/storage/overlay/b81df5cfdab715913cc0bfa657e3ca35585e9a2b8ecdc7150d385cfdcbb4bd1d/merged major:0 minor:361 fsType:overlay blockSize:0} overlay_0-378:{mountpoint:/var/lib/containers/storage/overlay/d44ce352353df87e40ba90790635a198d7a5d5dd19feae56ca14bdb2117f6321/merged major:0 minor:378 fsType:overlay blockSize:0} overlay_0-381:{mountpoint:/var/lib/containers/storage/overlay/9f22f9b86ac590b8fb97bbd2991b42844ab9aa390dae5d2c5fe398f9de33c973/merged major:0 minor:381 fsType:overlay blockSize:0} overlay_0-396:{mountpoint:/var/lib/containers/storage/overlay/6049fbfb4382be2b2770f73dbef8e8869a225350764f8a24a5883a965c973484/merged major:0 minor:396 fsType:overlay blockSize:0} overlay_0-398:{mountpoint:/var/lib/containers/storage/overlay/e13e0107e5d531524e4f64f7d172a699fe7714bdb9b42ec17c87d033314919f4/merged major:0 minor:398 fsType:overlay blockSize:0} overlay_0-400:{mountpoint:/var/lib/containers/storage/overlay/2c79e8782990aff2aaad47dcbf0bc62818c65edfe9dc7afa08e2839e4602921e/merged major:0 minor:400 fsType:overlay blockSize:0} overlay_0-411:{mountpoint:/var/lib/containers/storage/overlay/6bd48abc51921995b47b81feced78aff1d038da8e2f26888c6cf367408f258eb/merged major:0 minor:411 fsType:overlay blockSize:0} overlay_0-413:{mountpoint:/var/lib/containers/storage/overlay/93b12299ee060f972b8c64375d295209c775dec054c686fa702270b4478a76db/merged major:0 minor:413 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/b8d9fb64d0a86c9579df1b69dbb7fb9381eba8423ec1b6ffa5c4b6d538e1db66/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-427:{mountpoint:/var/lib/containers/storage/overlay/ff03f28fef2723b6ec6560f920bd60d8d5bf98b4ddb1533ef5ba7b771fe6a077/merged major:0 minor:427 fsType:overlay blockSize:0} 
overlay_0-43:{mountpoint:/var/lib/containers/storage/overlay/b0a0865328380b98d15d56326f4948b9b2db57aa4825a30611f4103fa370364a/merged major:0 minor:43 fsType:overlay blockSize:0} overlay_0-434:{mountpoint:/var/lib/containers/storage/overlay/c3235866cd7b4829de0d22864b904e6b762805e834b9c00b1f58470cfb3a1275/merged major:0 minor:434 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/74dcc27038dab84c831d6a2b1d09f0302211024d763dd3c18ae2b24726d50f9f/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-441:{mountpoint:/var/lib/containers/storage/overlay/be3d105877b346a1530e2fdce2f96a71fa651c87e75cecab4fcab169f2bd3d89/merged major:0 minor:441 fsType:overlay blockSize:0} overlay_0-443:{mountpoint:/var/lib/containers/storage/overlay/8c5a661a2d759e9c5403bc0946bcefff250f56371e3ff1913462fd780fe37625/merged major:0 minor:443 fsType:overlay blockSize:0} overlay_0-449:{mountpoint:/var/lib/containers/storage/overlay/0a7e81f2dbae6528adc35236c929df4854fc35e95722d98f7fc16b9deba1847d/merged major:0 minor:449 fsType:overlay blockSize:0} overlay_0-453:{mountpoint:/var/lib/containers/storage/overlay/8c3f406052fbd050e4cf500d341c379b74838d1d2308b8812cbc60a4e0a0ac95/merged major:0 minor:453 fsType:overlay blockSize:0} overlay_0-455:{mountpoint:/var/lib/containers/storage/overlay/672aa6d2122e34104b999693ca2d2261603281ba4311dc0873cbb1adc6b94f2d/merged major:0 minor:455 fsType:overlay blockSize:0} overlay_0-46:{mountpoint:/var/lib/containers/storage/overlay/87ac806201bed2599ca8e4ae6ee1e47c3fd890a15786b1d01d55cf6c4060afdb/merged major:0 minor:46 fsType:overlay blockSize:0} overlay_0-475:{mountpoint:/var/lib/containers/storage/overlay/3e47aa4b5be9c6f36c194c1a17724e7d94e5dad49ee94a16d1e9102d5c507864/merged major:0 minor:475 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/bf115911e83645c02f0385293947427caa9edcd57db7b4e543604574ee4d751e/merged major:0 minor:48 fsType:overlay blockSize:0} 
overlay_0-482:{mountpoint:/var/lib/containers/storage/overlay/e570219c72b6b2d602176f83141d0c891e0271e8994001cb76888c544af76158/merged major:0 minor:482 fsType:overlay blockSize:0} overlay_0-491:{mountpoint:/var/lib/containers/storage/overlay/6c9390eb58ea7298f0e85c1f19ee6e9553a0b1cd6226d6b782be094d83fdbbae/merged major:0 minor:491 fsType:overlay blockSize:0} overlay_0-494:{mountpoint:/var/lib/containers/storage/overlay/ae9e5b5a138e43627b5fbe5ad0274594f92d52eb031927a90f838012b83461eb/merged major:0 minor:494 fsType:overlay blockSize:0} overlay_0-495:{mountpoint:/var/lib/containers/storage/overlay/c988ac22faff3797db1cabed948890e7b2be95ece8a2103b88827f023a8ba290/merged major:0 minor:495 fsType:overlay blockSize:0} overlay_0-497:{mountpoint:/var/lib/containers/storage/overlay/c5537285e2840aab1b8c74e1999478d0d2dd73404befb4b4c6b9e67a3a18a070/merged major:0 minor:497 fsType:overlay blockSize:0} overlay_0-499:{mountpoint:/var/lib/containers/storage/overlay/de65dc664fd7c711eb9b48d66eca505232c79d0607d5071ba8143b2972a75493/merged major:0 minor:499 fsType:overlay blockSize:0} overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/e57a6c110d280335e566188b0105fb8b9361503a7b5f30dc52c0ee762a9111f3/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-51:{mountpoint:/var/lib/containers/storage/overlay/bcb6974f4c5a97be1af9e8d02858fad7ce1d3c959aa0040eb0511baf4ded7a9b/merged major:0 minor:51 fsType:overlay blockSize:0} overlay_0-511:{mountpoint:/var/lib/containers/storage/overlay/285f2b56fde17be6e519b99a7abd2093d6046033fd20871c2dad1bc6e8c4fc2c/merged major:0 minor:511 fsType:overlay blockSize:0} overlay_0-516:{mountpoint:/var/lib/containers/storage/overlay/116070ad799b9fcf66f9c62548a2c50988d09db0a9210f0cb0ed2d3454eab10d/merged major:0 minor:516 fsType:overlay blockSize:0} overlay_0-520:{mountpoint:/var/lib/containers/storage/overlay/4f8af2db388b3d26a26ac8a11d2daae2f047e41af01b8d060a16d8ef887f3b18/merged major:0 minor:520 fsType:overlay blockSize:0} 
overlay_0-523:{mountpoint:/var/lib/containers/storage/overlay/e36a155b6442b7d0a1d34b335802573baf8876daa788c4ed694ad930faeb5386/merged major:0 minor:523 fsType:overlay blockSize:0} overlay_0-528:{mountpoint:/var/lib/containers/storage/overlay/c78cc3d37d36f9e4631cd20af74e11bf92ce697249d7805f0eb96ea39265bf41/merged major:0 minor:528 fsType:overlay blockSize:0} overlay_0-530:{mountpoint:/var/lib/containers/storage/overlay/20a791d392fb7fe561401ad69ca67f2df7445fa4c9fc357fb0d8fa03192011b2/merged major:0 minor:530 fsType:overlay blockSize:0} overlay_0-532:{mountpoint:/var/lib/containers/storage/overlay/7f958ca2ed9c467c7269b709562a96ea1838ea26d8a10330fd00f6dd4cdbd934/merged major:0 minor:532 fsType:overlay blockSize:0} overlay_0-550:{mountpoint:/var/lib/containers/storage/overlay/e86ecf758eca6c5e24325c9caca8e12b76502a175e7a5034ab36a4502e08c62a/merged major:0 minor:550 fsType:overlay blockSize:0} overlay_0-555:{mountpoint:/var/lib/containers/storage/overlay/edf792bec232da00e6d5d791f132375b368044ff05f4b73f0568ac10da4667ad/merged major:0 minor:555 fsType:overlay blockSize:0} overlay_0-556:{mountpoint:/var/lib/containers/storage/overlay/ae12ad0224a272ac27487cba688898174e6af99643d9c6c090570246dbfef1b8/merged major:0 minor:556 fsType:overlay blockSize:0} overlay_0-558:{mountpoint:/var/lib/containers/storage/overlay/630216cb1da2c620d8f34c18df3b435c485a78af4f5bcb60cbfab22f5eff1384/merged major:0 minor:558 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/6ccaa384cb89c5e36d1b38242e7b182395e30b3129e51672b97b7e210b48f604/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-561:{mountpoint:/var/lib/containers/storage/overlay/bc12d1893c76f47721f142d1a4885e7d99d02a2c43f7e4d4e00ff54ef6bdd146/merged major:0 minor:561 fsType:overlay blockSize:0} overlay_0-562:{mountpoint:/var/lib/containers/storage/overlay/cb9ee8e9c4e1c70a63e836188bddb285bf0f9d1d213edbf3bcf5dd4b344cf92f/merged major:0 minor:562 fsType:overlay blockSize:0} 
overlay_0-568:{mountpoint:/var/lib/containers/storage/overlay/83a45936678a201c028cde3bcaea947d8850eb8b692e77494da57365356d3226/merged major:0 minor:568 fsType:overlay blockSize:0} overlay_0-572:{mountpoint:/var/lib/containers/storage/overlay/a1c7c874a3271d6a89473b8347096917eba5c375a7c7887fbc69eaf3377ca418/merged major:0 minor:572 fsType:overlay blockSize:0} overlay_0-586:{mountpoint:/var/lib/containers/storage/overlay/9745a1d0104603eb2fc7852f04cd67b0d01ee0ff59d691ba7af2652b78d8cf2b/merged major:0 minor:586 fsType:overlay blockSize:0} overlay_0-59:{mountpoint:/var/lib/containers/storage/overlay/92cd335182bc033ccee27d589c6e292b162487efc54b69f3df12f2ebfb338a93/merged major:0 minor:59 fsType:overlay blockSize:0} overlay_0-592:{mountpoint:/var/lib/containers/storage/overlay/db650c1684be05f57464690be5043327a033298ee8cdcb854292f1f4b99a695c/merged major:0 minor:592 fsType:overlay blockSize:0} overlay_0-594:{mountpoint:/var/lib/containers/storage/overlay/49976aafedc3170f8aa1e837bb01d4c16663768c4641130a9839dc76657ee70c/merged major:0 minor:594 fsType:overlay blockSize:0} overlay_0-605:{mountpoint:/var/lib/containers/storage/overlay/7aa696c7e9741f3b35c63277442fd03dc9c3b1b8e331c6598ec8ca5612e0630c/merged major:0 minor:605 fsType:overlay blockSize:0} overlay_0-618:{mountpoint:/var/lib/containers/storage/overlay/7ae6db05caaaf77789292764c1af19f402d73007e9049c5395512cecbc8c4ed0/merged major:0 minor:618 fsType:overlay blockSize:0} overlay_0-623:{mountpoint:/var/lib/containers/storage/overlay/1ab6a4fd4777e9e993a0ef5c3abc16be26a728f38c75bc1cebd4c9c9468f0da6/merged major:0 minor:623 fsType:overlay blockSize:0} overlay_0-626:{mountpoint:/var/lib/containers/storage/overlay/c5ed1fc06186b253e2fe08828a27c981916f75e5a8943a8389ea334833b1702a/merged major:0 minor:626 fsType:overlay blockSize:0} overlay_0-637:{mountpoint:/var/lib/containers/storage/overlay/540c47090aece219d124a21fd503255d23a7dacfe7536207aa2fd91db5070679/merged major:0 minor:637 fsType:overlay blockSize:0} 
overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/412c42b0776f613df634cab254bf5cc78b6cb21700a23427fa2bcb80a768e923/merged major:0 minor:65 fsType:overlay blockSize:0} overlay_0-67:{mountpoint:/var/lib/containers/storage/overlay/10877c46ba574aead997a0b6fdb0235e4af4aedb5b8e117a308d2695f0272274/merged major:0 minor:67 fsType:overlay blockSize:0} overlay_0-676:{mountpoint:/var/lib/containers/storage/overlay/e8b1cda46c4040a29914863e2a9ad91e4fd4f402a25de90aee29dee8853ac33b/merged major:0 minor:676 fsType:overlay blockSize:0} overlay_0-678:{mountpoint:/var/lib/containers/storage/overlay/478e6a2fdcfa4a2e48d15e017fb3527e4160c7f9904f4c4fc2a578d322817bbe/merged major:0 minor:678 fsType:overlay blockSize:0} overlay_0-680:{mountpoint:/var/lib/containers/storage/overlay/a9a18bf34cbba6acc1b4e6e9483f81e7fb26bb8cb24cbf4abacbfdd6c8be5ddf/merged major:0 minor:680 fsType:overlay blockSize:0} overlay_0-682:{mountpoint:/var/lib/containers/storage/overlay/9e8bc5e3759e929d8388e86d3d6388e1a9b6a2e27c1044df6caa071c9b4f6bd6/merged major:0 minor:682 fsType:overlay blockSize:0} overlay_0-684:{mountpoint:/var/lib/containers/storage/overlay/94facd7e398a235dd7255759ab02c5508af6ec9e9c2d36c1d0e8b04186a3393c/merged major:0 minor:684 fsType:overlay blockSize:0} overlay_0-687:{mountpoint:/var/lib/containers/storage/overlay/75c7d92dec76b489af5a80547372b8043d61ad1d589c5cf316277351f349cead/merged major:0 minor:687 fsType:overlay blockSize:0} overlay_0-689:{mountpoint:/var/lib/containers/storage/overlay/597b284a2fddf863ff759d777ec57cb320e8a95e10f4c45776b35579266e367f/merged major:0 minor:689 fsType:overlay blockSize:0} overlay_0-693:{mountpoint:/var/lib/containers/storage/overlay/31b1ead1c9398baa6f28b36b7fd07e40726081e0cde98e922b7c39fe5bfb2a40/merged major:0 minor:693 fsType:overlay blockSize:0} overlay_0-713:{mountpoint:/var/lib/containers/storage/overlay/5c7a58337dfd5db0154a442a30538f6838e0487dd3821257692c7fc2231c84ec/merged major:0 minor:713 fsType:overlay blockSize:0} 
overlay_0-717:{mountpoint:/var/lib/containers/storage/overlay/50af32c970a90f5b7d2be9401d9b183fd4e51148c79bcf365284f6e5161c038e/merged major:0 minor:717 fsType:overlay blockSize:0} overlay_0-719:{mountpoint:/var/lib/containers/storage/overlay/0dbe96ca6ba6c9e6b6584e13fdc6eaff92b12fdc9a1faa3b3c0f417e505c46af/merged major:0 minor:719 fsType:overlay blockSize:0} overlay_0-721:{mountpoint:/var/lib/containers/storage/overlay/a848ff4aa10aa65908542333c48763d0f4e55403c4834e65303b5a0b5a525f7b/merged major:0 minor:721 fsType:overlay blockSize:0} overlay_0-724:{mountpoint:/var/lib/containers/storage/overlay/eb1e70b1fd2eb3bb086613ac39be5a7b9b9cf3f8f41ed28a88b3e5905ec46d76/merged major:0 minor:724 fsType:overlay blockSize:0} overlay_0-73:{mountpoint:/var/lib/containers/storage/overlay/b8d4580c789aa3fd5d21301c96c7cc66e4dd47ab295a9e8bd327b59997ee5e34/merged major:0 minor:73 fsType:overlay blockSize:0} overlay_0-735:{mountpoint:/var/lib/containers/storage/overlay/7b8c89258e30be0413fe8c495895f3f22f5906a1d02f0cee2884dd29d2e80c3b/merged major:0 minor:735 fsType:overlay blockSize:0} overlay_0-739:{mountpoint:/var/lib/containers/storage/overlay/269a4cc5d1f75541545de6be0af212b9f035e467144d1ee347a596393341c230/merged major:0 minor:739 fsType:overlay blockSize:0} overlay_0-741:{mountpoint:/var/lib/containers/storage/overlay/ae767fa4a090dd349e310767fd19edc2e36a873d3c56b70067156f95594c7454/merged major:0 minor:741 fsType:overlay blockSize:0} overlay_0-743:{mountpoint:/var/lib/containers/storage/overlay/a9edbd187262b2ca5fa6c9657e4eeee92c1cc45629b6019087002efd97e45f9b/merged major:0 minor:743 fsType:overlay blockSize:0} overlay_0-748:{mountpoint:/var/lib/containers/storage/overlay/d551d7caeabd5c47260494e0e5f037b235435a4184a9c245b9dc2d15ea9069ba/merged major:0 minor:748 fsType:overlay blockSize:0} overlay_0-749:{mountpoint:/var/lib/containers/storage/overlay/735fea7ea06907d742d1a51186a831aa56886ab355d37fbb6d0a7caa09aeb9e1/merged major:0 minor:749 fsType:overlay blockSize:0} 
overlay_0-75:{mountpoint:/var/lib/containers/storage/overlay/a5552781353e8823964109ccb87d4238ba622b4dcadb7a2b883929e748535e9c/merged major:0 minor:75 fsType:overlay blockSize:0} overlay_0-77:{mountpoint:/var/lib/containers/storage/overlay/d807df03bd80558e080e5179889e16846c613ccd806ae3b2e418a17f57891f24/merged major:0 minor:77 fsType:overlay blockSize:0} overlay_0-786:{mountpoint:/var/lib/containers/storage/overlay/4270a3333fe496acb5b2d816dafe702c7340329dfcb8f015374d39d114018fa0/merged major:0 minor:786 fsType:overlay blockSize:0} overlay_0-788:{mountpoint:/var/lib/containers/storage/overlay/0ef4f1623a85bf53a9f36f93c2d669192f66445fd9fdf3de04edafb737807fe6/merged major:0 minor:788 fsType:overlay blockSize:0} overlay_0-79:{mountpoint:/var/lib/containers/storage/overlay/4cd7b7579fec5957502962194c0c9480b35105dd53320a59259f9038f9a58636/merged major:0 minor:79 fsType:overlay blockSize:0} overlay_0-794:{mountpoint:/var/lib/containers/storage/overlay/502adc12fe062cf794fab1775131e56240e8b393e95f30c50f069b45549a3e99/merged major:0 minor:794 fsType:overlay blockSize:0} overlay_0-810:{mountpoint:/var/lib/containers/storage/overlay/609017501b05d883a6ac5e9f48488573804328f5b998338e152d05881836fcb8/merged major:0 minor:810 fsType:overlay blockSize:0} overlay_0-815:{mountpoint:/var/lib/containers/storage/overlay/82262ec3266e97b0989c5301270f283bd4148ba1f1805be3be4704cff0ea0b9c/merged major:0 minor:815 fsType:overlay blockSize:0} overlay_0-817:{mountpoint:/var/lib/containers/storage/overlay/da9394f51934ec2ea331355f4fcf6aa1d48b3d52e54142201ddcbf267f38a784/merged major:0 minor:817 fsType:overlay blockSize:0} overlay_0-818:{mountpoint:/var/lib/containers/storage/overlay/9a8870701b6ca02fb7698962ef991aa4079bb7b481c41b0558e1694fb1139625/merged major:0 minor:818 fsType:overlay blockSize:0} overlay_0-825:{mountpoint:/var/lib/containers/storage/overlay/e4badb9fcabbaa98ce9fddeedfc4ba823188b0f410727ad0aece4ab3e6bdc0b4/merged major:0 minor:825 fsType:overlay blockSize:0} 
overlay_0-831:{mountpoint:/var/lib/containers/storage/overlay/2f66885490ebc2cf324bbf084436f865d5ad87fac3df95ffa55487466e42638f/merged major:0 minor:831 fsType:overlay blockSize:0} overlay_0-833:{mountpoint:/var/lib/containers/storage/overlay/c87fcad3c2735db5b60627d55a7e2c207cf2b4dd0d7f0024696478b5fa14f20b/merged major:0 minor:833 fsType:overlay blockSize:0} overlay_0-835:{mountpoint:/var/lib/containers/storage/overlay/9fe240cd634ca013f6d0ef6852bbf2e89863df76a640a68bae802209910c45e3/merged major:0 minor:835 fsType:overlay blockSize:0} overlay_0-841:{mountpoint:/var/lib/containers/storage/overlay/d3385d91e01b99bf39414722ec65c65f8036396891bdb4910afc5037885f0cf8/merged major:0 minor:841 fsType:overlay blockSize:0} overlay_0-843:{mountpoint:/var/lib/containers/storage/overlay/c36ccfdb0b151919645a21fb886f7d6c6ee49de5d24955f439cd1dcfc79248d5/merged major:0 minor:843 fsType:overlay blockSize:0} overlay_0-845:{mountpoint:/var/lib/containers/storage/overlay/af63eef3403a2032f90c3f3cfb402b2b27079616de4ea5674a125b5b27e0bc90/merged major:0 minor:845 fsType:overlay blockSize:0} overlay_0-847:{mountpoint:/var/lib/containers/storage/overlay/2764c27994191e06e47c25b2c53314836b7bc62b79d63f53b49b732da805cc38/merged major:0 minor:847 fsType:overlay blockSize:0} overlay_0-849:{mountpoint:/var/lib/containers/storage/overlay/f86641277f60f8ae4d14adb410d1020243fa144ce9da72f1bcb1c48fa58b8472/merged major:0 minor:849 fsType:overlay blockSize:0} overlay_0-85:{mountpoint:/var/lib/containers/storage/overlay/25cf1b381f5e51a1261902d2045f8709b097b7335c052a65cfe66e3485020e81/merged major:0 minor:85 fsType:overlay blockSize:0} overlay_0-859:{mountpoint:/var/lib/containers/storage/overlay/7fc67ac52806428bbf4f4ef0dc94f93a3ca5740ee2dda395baf3cb128753cf9a/merged major:0 minor:859 fsType:overlay blockSize:0} overlay_0-87:{mountpoint:/var/lib/containers/storage/overlay/b6a462382c97fa3da77f7fd18efaacf7a1c3363de14cc6beb75611b168e54689/merged major:0 minor:87 fsType:overlay blockSize:0} 
overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/0491b3c137c60d9d781a929c68ba6c4eb9b81ad867cc3d16db6077961f7a043a/merged major:0 minor:89 fsType:overlay blockSize:0} overlay_0-90:{mountpoint:/var/lib/containers/storage/overlay/a4d74d449eb8015decd4906807ed91dc149b09e6f3a793554e762230a3365416/merged major:0 minor:90 fsType:overlay blockSize:0} overlay_0-903:{mountpoint:/var/lib/containers/storage/overlay/09e6b30d084e3e46ad062878fb74060a27022d4c1dfcc69912cf16d75a864957/merged major:0 minor:903 fsType:overlay blockSize:0} overlay_0-906:{mountpoint:/var/lib/containers/storage/overlay/2768eea3617d8d024f35db5c0b6aaa01296bcd4691cfeafeef1e0fa05b42d77f/merged major:0 minor:906 fsType:overlay blockSize:0} overlay_0-908:{mountpoint:/var/lib/containers/storage/overlay/a12c29de0e43e3cc5cfac4aae44ab019cd31d3d2ab44e53b80c24dbfde2ba815/merged major:0 minor:908 fsType:overlay blockSize:0} overlay_0-914:{mountpoint:/var/lib/containers/storage/overlay/e024162f30a56f91664f9af92296c67b5ad25dac41d0021daa4b521758c92147/merged major:0 minor:914 fsType:overlay blockSize:0} overlay_0-93:{mountpoint:/var/lib/containers/storage/overlay/ee0ed21a7ab1bc6ec912252a7962910f892c7b76f859e270e4932d5d09de3560/merged major:0 minor:93 fsType:overlay blockSize:0} overlay_0-95:{mountpoint:/var/lib/containers/storage/overlay/0a2cda3abf88e1b75bd9401ad0f40f0b4996eca42bb2b1b83cd82ddce4e3ba15/merged major:0 minor:95 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/39a86cfedcfb4fe3dfdf1d3171cf14f7fcccf54b622227e103a534760281b09a/merged major:0 minor:96 fsType:overlay blockSize:0}] Mar 12 12:24:27.917660 master-0 kubenswrapper[13984]: I0312 12:24:27.916739 13984 manager.go:217] Machine: {Timestamp:2026-03-12 12:24:27.915688245 +0000 UTC m=+0.113703757 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 
AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:2fe2bf4a51c343709ee1f99d3a96e3ea SystemUUID:2fe2bf4a-51c3-4370-9ee1-f99d3a96e3ea BootID:473a10fb-d3cc-4f1f-a79c-240b5ee16b09 Filesystems:[{Device:overlay_0-713 DeviceMajor:0 DeviceMinor:713 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:421 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-350 DeviceMajor:0 DeviceMinor:350 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/68c57a64-f30c-4caf-89ef-08bd0d36833e/volumes/kubernetes.io~projected/kube-api-access-x4nbb DeviceMajor:0 DeviceMinor:783 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-845 DeviceMajor:0 DeviceMinor:845 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:570 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-586 DeviceMajor:0 DeviceMinor:586 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b320b1f710b0d644ecef2c2ad1bae1650b0f603989d0eba39702b2f48e918747/userdata/shm DeviceMajor:0 DeviceMinor:603 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~projected/kube-api-access-94src DeviceMajor:0 DeviceMinor:779 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f3b704e7-1291-4645-8a0d-2a937829d7ac/volumes/kubernetes.io~projected/kube-api-access-zct7x DeviceMajor:0 DeviceMinor:753 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/a73570b27cdf14004fcfc5c69eadd2063cb040486e8ff1a903a8a1c41a806cb5/userdata/shm DeviceMajor:0 DeviceMinor:784 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-818 DeviceMajor:0 DeviceMinor:818 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-85 DeviceMajor:0 DeviceMinor:85 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2860b265a556fe93cc79d001d83d971ba4d1223844dca9c9d4f423b151e14d7f/userdata/shm DeviceMajor:0 DeviceMinor:267 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-618 DeviceMajor:0 DeviceMinor:618 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:751 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/99a11fe6-48a1-439e-b788-158dbe267dcd/volumes/kubernetes.io~projected/kube-api-access-ns72p DeviceMajor:0 DeviceMinor:781 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/666857a1-0ddf-4b48-91f4-44cce154d1b1/volumes/kubernetes.io~projected/kube-api-access-vrqx7 DeviceMajor:0 DeviceMinor:92 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-499 DeviceMajor:0 DeviceMinor:499 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/021b22e3-b4c5-426d-b761-181f1e54175d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:383 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/632651f7-6641-49d8-9c48-7f6ea5846538/volumes/kubernetes.io~projected/kube-api-access-86k82 DeviceMajor:0 DeviceMinor:782 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4300be3c5fe59df72fe35edd262229cf307037ab319e70ec8058015a01d299e1/userdata/shm DeviceMajor:0 DeviceMinor:273 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-453 DeviceMajor:0 DeviceMinor:453 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a7cef5c725422fbd2eaaa42e7229e1a33d2367f18d70fb9f0b28995a04b77f89/userdata/shm DeviceMajor:0 DeviceMinor:670 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ce9b1542878c63fdbb66dd146c9a35513cb24753288c84fdf7cbec0ffe06048e/userdata/shm DeviceMajor:0 DeviceMinor:126 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~projected/kube-api-access-6l57v DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bfec511e21bbf0dfd869b19d5fa48c209028875c1e502c22cc890b2268398c69/userdata/shm DeviceMajor:0 DeviceMinor:662 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-77 DeviceMajor:0 DeviceMinor:77 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:480 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/021b22e3-b4c5-426d-b761-181f1e54175d/volumes/kubernetes.io~projected/kube-api-access-xggp6 DeviceMajor:0 DeviceMinor:462 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-494 DeviceMajor:0 DeviceMinor:494 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-413 DeviceMajor:0 DeviceMinor:413 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8/userdata/shm DeviceMajor:0 DeviceMinor:632 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-324 DeviceMajor:0 DeviceMinor:324 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:overlay_0-139 DeviceMajor:0 DeviceMinor:139 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~projected/kube-api-access-x8mvz DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-532 DeviceMajor:0 DeviceMinor:532 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-51 DeviceMajor:0 DeviceMinor:51 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-378 DeviceMajor:0 DeviceMinor:378 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/volumes/kubernetes.io~projected/kube-api-access-wvrpf DeviceMajor:0 DeviceMinor:370 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:601 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-719 DeviceMajor:0 DeviceMinor:719 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-326 DeviceMajor:0 DeviceMinor:326 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-743 DeviceMajor:0 DeviceMinor:743 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/15bf86d9-62b3-4af8-b6f6-23131d712332/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:386 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-689 DeviceMajor:0 DeviceMinor:689 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:647 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-786 DeviceMajor:0 DeviceMinor:786 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ad38671dc0e01021059e694690e391eb7e5f662e10f37c5a4ec0d76fc36b9929/userdata/shm DeviceMajor:0 DeviceMinor:803 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326/volumes/kubernetes.io~projected/kube-api-access-xx2c4 DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cebeefbf62ca9404f584042e3cbed22c69cb26efc6632b93a6a7fa2b6a0952e9/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/43388f3fc4030bd546e0a3b05c45ff8414138bbab582c7b9e9531efe462ae9bb/userdata/shm DeviceMajor:0 DeviceMinor:309 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-398 DeviceMajor:0 DeviceMinor:398 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:481 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-356 DeviceMajor:0 DeviceMinor:356 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-59 DeviceMajor:0 DeviceMinor:59 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c4db6563159c41f8567591a440bd8dab86b009b1c6a27aeab29775c822f73bc3/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32/userdata/shm DeviceMajor:0 DeviceMinor:466 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-361 DeviceMajor:0 DeviceMinor:361 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-831 DeviceMajor:0 DeviceMinor:831 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7e3651e4363549a04b82ec26394498f6d4ec7456d1ae54da099d3f3dc779acb3/userdata/shm DeviceMajor:0 DeviceMinor:665 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-680 DeviceMajor:0 DeviceMinor:680 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a9dc508c688ae34bc2eddc6709fa675fb20e54f9d7eb42f1172affaecbda59cc/userdata/shm DeviceMajor:0 DeviceMinor:798 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/269d77d9-815e-4324-8827-1ce429063ed1/volumes/kubernetes.io~projected/kube-api-access-jxt29 DeviceMajor:0 DeviceMinor:303 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:579 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~projected/kube-api-access-dcb5s DeviceMajor:0 DeviceMinor:602 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9/userdata/shm DeviceMajor:0 DeviceMinor:468 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-678 DeviceMajor:0 DeviceMinor:678 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8d2d7e73e7439ba1eb1229f42a2e5a6be0805b87e3874b733e04979b526355b/userdata/shm DeviceMajor:0 DeviceMinor:790 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-815 DeviceMajor:0 DeviceMinor:815 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-149 DeviceMajor:0 DeviceMinor:149 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/54612733-158f-4a92-a1bf-f4a8d653ffaf/volumes/kubernetes.io~projected/kube-api-access-9lnbq DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4e68f4bc560c5df24d0c2f98d45fcf56d54f2e3305701c6372ad15ea940098c4/userdata/shm DeviceMajor:0 DeviceMinor:489 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/99a11fe6-48a1-439e-b788-158dbe267dcd/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:778 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-495 DeviceMajor:0 DeviceMinor:495 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-833 DeviceMajor:0 DeviceMinor:833 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-908 DeviceMajor:0 DeviceMinor:908 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-849 DeviceMajor:0 DeviceMinor:849 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa52915b5c64f27dfbe098d72f59520909f2c72423d01e136d359cccc8cf8e79/userdata/shm DeviceMajor:0 DeviceMinor:114 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/580bafd6-af8c-4961-b959-b736a180e309/volumes/kubernetes.io~projected/kube-api-access-lf74f DeviceMajor:0 DeviceMinor:338 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/aa8ddfdd-7f2d-4fd4-b666-1497dee752df/volumes/kubernetes.io~projected/kube-api-access-6wj8x DeviceMajor:0 DeviceMinor:420 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:600 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f19c3c89-8d32-4394-bd86-e5ef7734c42b/volumes/kubernetes.io~projected/kube-api-access-s28gq DeviceMajor:0 DeviceMinor:770 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/c999be46e4c7dd19c23c0225b79825f8ad177bab53b4d5cdac62201c5aa7f539/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-523 DeviceMajor:0 DeviceMinor:523 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-724 DeviceMajor:0 DeviceMinor:724 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-693 DeviceMajor:0 DeviceMinor:693 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-330 DeviceMajor:0 DeviceMinor:330 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/81cb0504-9455-4398-aed1-5cc6790f292e/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:752 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be/userdata/shm DeviceMajor:0 DeviceMinor:52 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-174 DeviceMajor:0 DeviceMinor:174 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cfd178d7-f518-413b-95ab-ab6687be6e0f/volumes/kubernetes.io~projected/kube-api-access-r5xlx DeviceMajor:0 DeviceMinor:248 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/5a012d0b-d1a8-4cd3-8b91-b346d0445f24/volumes/kubernetes.io~projected/kube-api-access-9bx48 DeviceMajor:0 DeviceMinor:255 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/64f1dc2d81d717bf0671e44d8f5d716029eb32e5b7d761c1919a015da1e533b6/userdata/shm DeviceMajor:0 DeviceMinor:371 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-841 DeviceMajor:0 DeviceMinor:841 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-847 DeviceMajor:0 DeviceMinor:847 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-400 DeviceMajor:0 DeviceMinor:400 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-434 DeviceMajor:0 DeviceMinor:434 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d7aae39ac0f8cea9552f3bc1f796a0a6a68481f6079d7651593a9da0b4c18f5a/userdata/shm DeviceMajor:0 DeviceMinor:672 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-381 DeviceMajor:0 DeviceMinor:381 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-491 DeviceMajor:0 DeviceMinor:491 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-676 DeviceMajor:0 DeviceMinor:676 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-637 DeviceMajor:0 DeviceMinor:637 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~projected/kube-api-access-pwfct DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2bab9dba-235f-467c-9224-634cca9acbd2/volumes/kubernetes.io~projected/kube-api-access-cv9kn DeviceMajor:0 DeviceMinor:780 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/81cb0504-9455-4398-aed1-5cc6790f292e/volumes/kubernetes.io~projected/kube-api-access-j5jgq DeviceMajor:0 DeviceMinor:754 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4668278e293f25c682097a5e7d9cfb181571373cacd2a15b29dc6c82fc88e8f4/userdata/shm DeviceMajor:0 DeviceMinor:805 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-145 DeviceMajor:0 DeviceMinor:145 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6519636251c6c38ef9d066c61d6299777a9c7b8af3d694989727eb85f1e60cdc/userdata/shm DeviceMajor:0 DeviceMinor:274 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:477 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-550 DeviceMajor:0 DeviceMinor:550 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/f6b1422fb985a54d196c8cc057ae0368a92f6fe87e61b3d31c43cf22a41a1666/userdata/shm DeviceMajor:0 DeviceMinor:792 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-825 DeviceMajor:0 DeviceMinor:825 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/55bf535c-93ab-4870-a9d2-c02496d71ef0/volumes/kubernetes.io~projected/kube-api-access-svpvs DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-520 DeviceMajor:0 DeviceMinor:520 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:648 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-67 DeviceMajor:0 DeviceMinor:67 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9d47f860-d64a-49b8-b404-a67cbc2faeb6/volumes/kubernetes.io~projected/kube-api-access-njx9l DeviceMajor:0 DeviceMinor:507 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-592 DeviceMajor:0 DeviceMinor:592 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-721 DeviceMajor:0 DeviceMinor:721 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-46 DeviceMajor:0 DeviceMinor:46 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-322 DeviceMajor:0 DeviceMinor:322 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 
DeviceMinor:479 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-93 DeviceMajor:0 DeviceMinor:93 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5afef1522b2b9d45e7e11a3cf8046e3aa9b0904c684f2bd377cc117b78bd3b81/userdata/shm DeviceMajor:0 DeviceMinor:103 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:435 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2add9d2e5cdf8eafbdab4b7a44808b2e6f41988b57e334321268362b6270eb86/userdata/shm DeviceMajor:0 DeviceMinor:663 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-687 DeviceMajor:0 DeviceMinor:687 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-73 DeviceMajor:0 DeviceMinor:73 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-95 DeviceMajor:0 DeviceMinor:95 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-914 DeviceMajor:0 DeviceMinor:914 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~projected/kube-api-access-kv9fk DeviceMajor:0 DeviceMinor:102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d9d20b5228d0e4a25786b4607cb41d5225d6afeeb2f986364f8b83cf7ffb3587/userdata/shm DeviceMajor:0 DeviceMinor:259 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ed5a074c-e194-4b16-a4c9-0d82830bf7ca/volumes/kubernetes.io~projected/kube-api-access-qhwbb DeviceMajor:0 DeviceMinor:584 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-572 DeviceMajor:0 DeviceMinor:572 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-316 DeviceMajor:0 DeviceMinor:316 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b8ffdf31e610994af2b57b9f88a7795bae0cba26fb04c18d8c2445e3f0680a53/userdata/shm DeviceMajor:0 DeviceMinor:423 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-443 DeviceMajor:0 DeviceMinor:443 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/88bba27aba99f4035ab7b0c7f4bc171c7586160f39546ce2e2e9d71e92be1a68/userdata/shm DeviceMajor:0 DeviceMinor:508 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-441 DeviceMajor:0 DeviceMinor:441 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-340 DeviceMajor:0 DeviceMinor:340 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-87 DeviceMajor:0 DeviceMinor:87 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/114b1d16-b37d-449c-84e3-3fb3f8b20eaa/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:470 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2ada9e185ebcd7796052d553750f50a0d8e59b4d4c070029c61ef23cd75f5a22/userdata/shm DeviceMajor:0 DeviceMinor:808 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-315 DeviceMajor:0 DeviceMinor:315 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~projected/kube-api-access-m9gmt DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:661 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-739 DeviceMajor:0 DeviceMinor:739 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:514 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:775 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-105 DeviceMajor:0 DeviceMinor:105 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:141 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ab087440-bdf2-4e2f-9a5a-434d50a2329a/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-516 DeviceMajor:0 DeviceMinor:516 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-334 DeviceMajor:0 DeviceMinor:334 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-794 DeviceMajor:0 DeviceMinor:794 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df/userdata/shm DeviceMajor:0 DeviceMinor:61 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-43 DeviceMajor:0 DeviceMinor:43 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~projected/kube-api-access-gd645 DeviceMajor:0 DeviceMinor:133 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a31234df882122d07e602b31ef9412c0c41c548e3f6b29cd809ca5a3f68cae28/userdata/shm DeviceMajor:0 DeviceMinor:483 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-497 DeviceMajor:0 DeviceMinor:497 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6/userdata/shm DeviceMajor:0 DeviceMinor:84 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-427 DeviceMajor:0 DeviceMinor:427 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/48802dfcb8b0bd5f567eee89b3f4b5a9769fbf82dea70b650fc097ec0ea21366/userdata/shm DeviceMajor:0 DeviceMinor:800 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/9576089e2647ba9a9f5df3c572d8e7de7b7129020d7e31900e7d5c8dd8366e64/userdata/shm DeviceMajor:0 DeviceMinor:254 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:504 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f3b704e7-1291-4645-8a0d-2a937829d7ac/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:731 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/57810d434673756bb33355876f6921aff74c6f281e31a4bda0e7128f4df78dd1/userdata/shm DeviceMajor:0 DeviceMinor:484 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:505 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/volumes/kubernetes.io~projected/kube-api-access-btbt7 DeviceMajor:0 DeviceMinor:402 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2bab9dba-235f-467c-9224-634cca9acbd2/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:776 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c873b656-d2aa-4d0e-aa22-9f8d35186473/volumes/kubernetes.io~projected/kube-api-access-kgxbv DeviceMajor:0 DeviceMinor:515 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-558 DeviceMajor:0 DeviceMinor:558 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c62edaec-38e2-4b73-8bb5-c776abfb310f/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:709 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-594 DeviceMajor:0 DeviceMinor:594 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-626 DeviceMajor:0 DeviceMinor:626 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a154f648-b96d-449e-b0f5-ba32266000c2/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/aa8ddfdd-7f2d-4fd4-b666-1497dee752df/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:415 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e6032cd6fa69cd106d479f475c74d70d0b23b0584227d89843773716e915d757/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:478 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-568 DeviceMajor:0 DeviceMinor:568 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/91dfd404945dfa72b72f5ee55329cfd84189e41ee3e0a4c889c8d9f86c69e940/userdata/shm DeviceMajor:0 DeviceMinor:666 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-147 DeviceMajor:0 DeviceMinor:147 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~projected/kube-api-access-xmlzw DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a346ac54-02fe-417f-a49d-038e45b13a1d/volumes/kubernetes.io~projected/kube-api-access-9dhfq DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-717 DeviceMajor:0 DeviceMinor:717 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-859 DeviceMajor:0 DeviceMinor:859 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-843 DeviceMajor:0 DeviceMinor:843 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b5ba5b90d6ae03cfb78b058e1659ce46a060105bfebabaafdc41fb8977ded8b7/userdata/shm DeviceMajor:0 DeviceMinor:521 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-528 DeviceMajor:0 DeviceMinor:528 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9bc7dea3-1868-488c-a34b-288cde3acd35/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:659 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-250 DeviceMajor:0 DeviceMinor:250 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-411 DeviceMajor:0 DeviceMinor:411 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-735 DeviceMajor:0 DeviceMinor:735 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-455 DeviceMajor:0 DeviceMinor:455 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-530 DeviceMajor:0 DeviceMinor:530 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/437f19dee9aaecaf0da0f9a4e2b870d07ad9f109bec28fd8d9b704649289798a/userdata/shm DeviceMajor:0 DeviceMinor:802 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-817 DeviceMajor:0 DeviceMinor:817 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/61ab511b-72e9-4fb9-b5de-770f49514369/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:101 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~projected/kube-api-access-jcltq DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b5890f0c-cebe-4788-89f7-27568d875741/volumes/kubernetes.io~projected/kube-api-access-fqcrz DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-906 DeviceMajor:0 DeviceMinor:906 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:660 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-75 DeviceMajor:0 DeviceMinor:75 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-130 DeviceMajor:0 
DeviceMinor:130 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/51d58450-50bb-4da0-b1f6-4135fbabd856/volumes/kubernetes.io~projected/kube-api-access-wg27g DeviceMajor:0 DeviceMinor:142 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/720101f1-0833-45af-a5b7-4910ece2a589/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:596 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ae2269d7-f11f-46d1-95e7-f89a70ee1152/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:657 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-684 DeviceMajor:0 DeviceMinor:684 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-317 DeviceMajor:0 DeviceMinor:317 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-556 DeviceMajor:0 DeviceMinor:556 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d1d16bbc-778b-4fc1-abb2-b43e79a7c532/volumes/kubernetes.io~projected/kube-api-access-jq9d5 DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b9194868-75ce-4138-a9d4-ddd64660c529/volumes/kubernetes.io~projected/kube-api-access-s4r7v DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9ed5d4e2f5f9d30b72c838a61c437059eadf2c2990c7efa7688884bd954ef475/userdata/shm DeviceMajor:0 DeviceMinor:392 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/ded4724d743688b7920664606d22270128172d27c82b29e81d8b50ac01a66fa9/userdata/shm DeviceMajor:0 DeviceMinor:439 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-90 DeviceMajor:0 DeviceMinor:90 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-749 DeviceMajor:0 DeviceMinor:749 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-903 DeviceMajor:0 DeviceMinor:903 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-128 DeviceMajor:0 DeviceMinor:128 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8a121d0d-d201-446b-97a1-e2414e599f4a/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:253 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-332 DeviceMajor:0 DeviceMinor:332 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-810 DeviceMajor:0 DeviceMinor:810 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-79 DeviceMajor:0 DeviceMinor:79 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e19aac61221800c39184f69358de8f3811cd80c87d61a378f35e2e0a8d91765c/userdata/shm DeviceMajor:0 DeviceMinor:245 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2d15b9b0e60b6b33d755ed7f9384d53ec5286f5e0e424707fd880f8d36ba63ba/userdata/shm DeviceMajor:0 DeviceMinor:257 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-396 DeviceMajor:0 DeviceMinor:396 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-511 DeviceMajor:0 DeviceMinor:511 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:overlay_0-562 DeviceMajor:0 DeviceMinor:562 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/70d56295fb2db27f91085477bc0d85a5a75eec054bb4637a7df1876812db41e7/userdata/shm DeviceMajor:0 DeviceMinor:674 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f19c3c89-8d32-4394-bd86-e5ef7734c42b/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:551 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/632651f7-6641-49d8-9c48-7f6ea5846538/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:777 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cd26eb3215c3045fef95e54a836428d267922b388ae43b971d77bc3784523536/userdata/shm DeviceMajor:0 DeviceMinor:791 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ea80247e-b4dd-45dc-8255-6e68508c8480/volumes/kubernetes.io~projected/kube-api-access-xbztv DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/volumes/kubernetes.io~projected/kube-api-access-6flvz DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:658 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-682 DeviceMajor:0 DeviceMinor:682 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/745f17315daebf90a850c4e46eaa552a044dd14adccfee57d001e708ed385cdf/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/36852fda-6aee-4a36-8724-537f1260c4c8/volumes/kubernetes.io~projected/kube-api-access-54547 DeviceMajor:0 DeviceMinor:510 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1faac8f245695a0b4273e53303a303c00cf33c7f391f25e9c09fc9c6b457b1b5/userdata/shm DeviceMajor:0 DeviceMinor:304 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/volumes/kubernetes.io~projected/kube-api-access-tb249 DeviceMajor:0 DeviceMinor:771 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-555 DeviceMajor:0 DeviceMinor:555 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3b72e72a154930e6638978b9d96d1fb9f48ad70245754bc46fa997ab9b768457/userdata/shm DeviceMajor:0 DeviceMinor:473 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:774 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fd48f85cdff86dca5fd974e33c046e44989ec396616456ae028b5495072f5b8b/userdata/shm DeviceMajor:0 DeviceMinor:143 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0aeeef2a-f9df-4f87-b985-bd1da94c76c3/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-269 DeviceMajor:0 DeviceMinor:269 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/700144f71aaf4edd85bba4cceda2dd6a1013711fa671c636535cf91528c721c3/userdata/shm DeviceMajor:0 DeviceMinor:487 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/022dd526-0ea5-4224-9d2e-778ed4ef8a56/volumes/kubernetes.io~projected/kube-api-access-pxrnf DeviceMajor:0 DeviceMinor:422 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-449 DeviceMajor:0 DeviceMinor:449 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-605 DeviceMajor:0 DeviceMinor:605 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/114b1d16-b37d-449c-84e3-3fb3f8b20eaa/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:471 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e64bc838-280e-4231-9732-1adb69fed0bc/volumes/kubernetes.io~projected/kube-api-access-tq5c7 DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a760fef0730108276c12f0e7e65889ee4d7455d96d4c3e35ad89035b139d417/userdata/shm DeviceMajor:0 DeviceMinor:263 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/15bf86d9-62b3-4af8-b6f6-23131d712332/volumes/kubernetes.io~projected/kube-api-access-g7tct DeviceMajor:0 DeviceMinor:391 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-482 DeviceMajor:0 DeviceMinor:482 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/10498208-0692-4533-b672-a7a2cfcdf1be/volumes/kubernetes.io~projected/kube-api-access-xdwfl DeviceMajor:0 DeviceMinor:118 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f04121eb-5c7b-42cd-a2e2-26cf1c67593d/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:132 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/c62edaec-38e2-4b73-8bb5-c776abfb310f/volumes/kubernetes.io~projected/kube-api-access-cdfnh DeviceMajor:0 DeviceMinor:710 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-748 DeviceMajor:0 DeviceMinor:748 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/68c57a64-f30c-4caf-89ef-08bd0d36833e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:773 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-741 DeviceMajor:0 DeviceMinor:741 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/02d5a507-4409-44b4-98bc-1751cdcc6c6a/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:763 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-788 DeviceMajor:0 DeviceMinor:788 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-623 DeviceMajor:0 DeviceMinor:623 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2f3a291a-d9af-4e0f-a307-8928e4dc523d/volumes/kubernetes.io~projected/kube-api-access-b2f6r DeviceMajor:0 DeviceMinor:125 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d961a5f0-84b7-47d7-846b-238475947121/volumes/kubernetes.io~projected/kube-api-access-zdvf6 DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9d47f860-d64a-49b8-b404-a67cbc2faeb6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:506 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-355 DeviceMajor:0 DeviceMinor:355 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/18fbe767c6fc6878f34a3f3c362845f142e2f102bf12b38c75ce6966ee2eee2a/userdata/shm DeviceMajor:0 DeviceMinor:134 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3c02552c-a477-4c6c-8a45-2fdc758c084b/volumes/kubernetes.io~projected/kube-api-access-p97xk DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9093184a94434d195f504d748e233081459211af66933b52ecc618c767700a31/userdata/shm DeviceMajor:0 DeviceMinor:393 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-561 DeviceMajor:0 DeviceMinor:561 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-475 DeviceMajor:0 DeviceMinor:475 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/02d5a507-4409-44b4-98bc-1751cdcc6c6a/volumes/kubernetes.io~projected/kube-api-access-jd4jz DeviceMajor:0 DeviceMinor:772 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5ea8db4b234956cfb44c55bf4aa2d228c2ba7b9a44ddc7601a2ade186535532c/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/var/lib/kubelet/pods/a22189f2-3f35-4ea6-9892-39a1b46637e2/volumes/kubernetes.io~projected/kube-api-access-ft5sd DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1588984bdf65a943d367a0cf55fa9d9d29fbe3f46cc19ebf7ccc83dd17e2e7b7/userdata/shm DeviceMajor:0 DeviceMinor:513 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:107 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9b960fe2-d59e-4ee1-bd9d-455b46753cb9/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:252 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-835 DeviceMajor:0 DeviceMinor:835 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:1faac8f245695a0 MacAddress:b6:d0:3f:5e:d8:18 Speed:10000 Mtu:8900} {Name:2860b265a556fe9 MacAddress:e6:c1:67:b2:79:ea Speed:10000 Mtu:8900} {Name:2ada9e185ebcd77 MacAddress:5e:72:c1:10:f6:87 Speed:10000 Mtu:8900} {Name:2add9d2e5cdf8ea MacAddress:3a:da:b5:ab:70:95 Speed:10000 Mtu:8900} {Name:2d15b9b0e60b6b3 MacAddress:ce:c3:f0:12:75:db Speed:10000 Mtu:8900} {Name:4300be3c5fe59df MacAddress:ce:d8:bb:7c:27:31 Speed:10000 Mtu:8900} {Name:43388f3fc4030bd MacAddress:ea:ed:08:b7:68:89 Speed:10000 Mtu:8900} {Name:437f19dee9aaeca MacAddress:7a:e0:26:78:fc:44 Speed:10000 Mtu:8900} {Name:4668278e293f25c MacAddress:72:96:bb:82:da:76 Speed:10000 Mtu:8900} 
{Name:48802dfcb8b0bd5 MacAddress:4a:63:cc:2b:95:6d Speed:10000 Mtu:8900} {Name:4e68f4bc560c5df MacAddress:fe:dc:07:18:ab:07 Speed:10000 Mtu:8900} {Name:57810d434673756 MacAddress:2e:61:52:f9:5a:7d Speed:10000 Mtu:8900} {Name:64f1dc2d81d717b MacAddress:f6:3b:f2:5c:2e:36 Speed:10000 Mtu:8900} {Name:6519636251c6c38 MacAddress:82:37:e8:5f:c7:a3 Speed:10000 Mtu:8900} {Name:700144f71aaf4ed MacAddress:3e:a9:84:37:3c:d6 Speed:10000 Mtu:8900} {Name:70d56295fb2db27 MacAddress:e2:40:95:b7:dd:02 Speed:10000 Mtu:8900} {Name:745f17315daebf9 MacAddress:1e:9b:73:a4:06:32 Speed:10000 Mtu:8900} {Name:76740ed254dde85 MacAddress:72:80:2b:3a:a6:f6 Speed:10000 Mtu:8900} {Name:7e3651e4363549a MacAddress:2a:23:eb:bc:a7:f6 Speed:10000 Mtu:8900} {Name:88bba27aba99f40 MacAddress:de:06:69:d2:71:7a Speed:10000 Mtu:8900} {Name:8a760fef0730108 MacAddress:de:84:a2:04:65:37 Speed:10000 Mtu:8900} {Name:9093184a94434d1 MacAddress:ea:f0:56:df:5a:b8 Speed:10000 Mtu:8900} {Name:91dfd404945dfa7 MacAddress:b2:45:b5:84:5a:3a Speed:10000 Mtu:8900} {Name:9576089e2647ba9 MacAddress:3e:7f:04:69:1b:62 Speed:10000 Mtu:8900} {Name:9ed5d4e2f5f9d30 MacAddress:fa:03:58:c4:9a:89 Speed:10000 Mtu:8900} {Name:a31234df882122d MacAddress:da:37:1a:2f:a1:20 Speed:10000 Mtu:8900} {Name:a7cef5c725422fb MacAddress:6a:6c:c7:e1:c3:1d Speed:10000 Mtu:8900} {Name:a9dc508c688ae34 MacAddress:c6:70:a6:ba:e7:12 Speed:10000 Mtu:8900} {Name:ad38671dc0e0102 MacAddress:7a:f2:94:fc:4c:80 Speed:10000 Mtu:8900} {Name:b320b1f710b0d64 MacAddress:d2:59:41:36:4d:d7 Speed:10000 Mtu:8900} {Name:b5ba5b90d6ae03c MacAddress:ca:e1:3c:32:26:ca Speed:10000 Mtu:8900} {Name:b8ffdf31e610994 MacAddress:ca:95:e9:da:f4:0a Speed:10000 Mtu:8900} {Name:bfec511e21bbf0d MacAddress:76:ea:18:e5:47:cb Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:02:31:d3:8a:3a:c8 Speed:0 Mtu:8900} {Name:c4db6563159c41f MacAddress:46:80:36:f7:ae:0a Speed:10000 Mtu:8900} {Name:c999be46e4c7dd1 MacAddress:ba:8b:5e:14:c9:5c 
Speed:10000 Mtu:8900} {Name:cd26eb3215c3045 MacAddress:2e:56:08:b6:c9:0f Speed:10000 Mtu:8900} {Name:cebeefbf62ca940 MacAddress:76:0c:14:65:20:6e Speed:10000 Mtu:8900} {Name:d7aae39ac0f8cea MacAddress:16:e6:8d:44:2b:2c Speed:10000 Mtu:8900} {Name:d9d20b5228d0e4a MacAddress:6e:7a:6f:35:0e:6c Speed:10000 Mtu:8900} {Name:de8fe1d3e9190ca MacAddress:9a:0e:57:13:c6:92 Speed:10000 Mtu:8900} {Name:ded4724d743688b MacAddress:66:01:66:c5:ac:be Speed:10000 Mtu:8900} {Name:e19aac61221800c MacAddress:2e:cd:23:30:20:1f Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:30:db:e2 Speed:-1 Mtu:9000} {Name:eth2 MacAddress:fa:16:3e:e7:70:da Speed:-1 Mtu:9000} {Name:f6b1422fb985a54 MacAddress:0e:16:87:70:50:71 Speed:10000 Mtu:8900} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:82:08:64:40:9a:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 
Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 12 12:24:27.918176 master-0 kubenswrapper[13984]: I0312 12:24:27.917643 
13984 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 12 12:24:27.918176 master-0 kubenswrapper[13984]: I0312 12:24:27.917702 13984 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 12 12:24:27.918176 master-0 kubenswrapper[13984]: I0312 12:24:27.918040 13984 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 12 12:24:27.918176 master-0 kubenswrapper[13984]: I0312 12:24:27.918156 13984 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 12 12:24:27.918433 master-0 kubenswrapper[13984]: I0312 12:24:27.918186 13984 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Perc
entage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 12 12:24:27.918433 master-0 kubenswrapper[13984]: I0312 12:24:27.918389 13984 topology_manager.go:138] "Creating topology manager with none policy" Mar 12 12:24:27.918433 master-0 kubenswrapper[13984]: I0312 12:24:27.918397 13984 container_manager_linux.go:303] "Creating device plugin manager" Mar 12 12:24:27.918433 master-0 kubenswrapper[13984]: I0312 12:24:27.918406 13984 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 12:24:27.918433 master-0 kubenswrapper[13984]: I0312 12:24:27.918426 13984 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 12 12:24:27.918600 master-0 kubenswrapper[13984]: I0312 12:24:27.918457 13984 state_mem.go:36] "Initialized new in-memory state store" Mar 12 12:24:27.918600 master-0 kubenswrapper[13984]: I0312 12:24:27.918552 13984 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 12 12:24:27.918653 master-0 kubenswrapper[13984]: I0312 12:24:27.918605 13984 kubelet.go:418] "Attempting to sync node with API server" Mar 12 12:24:27.918653 master-0 kubenswrapper[13984]: I0312 12:24:27.918617 13984 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 12 12:24:27.918653 master-0 kubenswrapper[13984]: I0312 12:24:27.918629 13984 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 12 
12:24:27.918653 master-0 kubenswrapper[13984]: I0312 12:24:27.918639 13984 kubelet.go:324] "Adding apiserver pod source" Mar 12 12:24:27.918653 master-0 kubenswrapper[13984]: I0312 12:24:27.918654 13984 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 12 12:24:27.920389 master-0 kubenswrapper[13984]: I0312 12:24:27.919769 13984 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 12 12:24:27.920389 master-0 kubenswrapper[13984]: I0312 12:24:27.920062 13984 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 12 12:24:27.920389 master-0 kubenswrapper[13984]: I0312 12:24:27.920327 13984 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 12 12:24:27.922230 master-0 kubenswrapper[13984]: I0312 12:24:27.922184 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 12 12:24:27.922230 master-0 kubenswrapper[13984]: I0312 12:24:27.922227 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 12 12:24:27.922230 master-0 kubenswrapper[13984]: I0312 12:24:27.922256 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 12 12:24:27.922230 master-0 kubenswrapper[13984]: I0312 12:24:27.922265 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922275 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922285 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922296 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 12 12:24:27.922404 master-0 
kubenswrapper[13984]: I0312 12:24:27.922304 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922320 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922326 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922337 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922349 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 12 12:24:27.922404 master-0 kubenswrapper[13984]: I0312 12:24:27.922381 13984 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 12 12:24:27.923079 master-0 kubenswrapper[13984]: I0312 12:24:27.923049 13984 server.go:1280] "Started kubelet" Mar 12 12:24:27.923867 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 12 12:24:27.924011 master-0 kubenswrapper[13984]: I0312 12:24:27.923972 13984 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 12 12:24:27.925215 master-0 kubenswrapper[13984]: I0312 12:24:27.925071 13984 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 12 12:24:27.925422 master-0 kubenswrapper[13984]: I0312 12:24:27.925191 13984 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 12 12:24:27.926234 master-0 kubenswrapper[13984]: I0312 12:24:27.926192 13984 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 12 12:24:27.932800 master-0 kubenswrapper[13984]: I0312 12:24:27.932690 13984 server.go:449] "Adding debug handlers to kubelet server" Mar 12 12:24:27.936248 master-0 kubenswrapper[13984]: E0312 12:24:27.936217 13984 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 12 12:24:27.937153 master-0 kubenswrapper[13984]: I0312 12:24:27.937130 13984 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 12 12:24:27.937260 master-0 kubenswrapper[13984]: I0312 12:24:27.937246 13984 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 12 12:24:27.937500 master-0 kubenswrapper[13984]: I0312 12:24:27.937269 13984 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-13 12:11:16 +0000 UTC, rotation deadline is 2026-03-13 09:45:16.572812175 +0000 UTC Mar 12 12:24:27.937500 master-0 kubenswrapper[13984]: I0312 12:24:27.937497 13984 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 12 12:24:27.937611 master-0 kubenswrapper[13984]: I0312 12:24:27.937498 13984 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 21h20m48.635318515s for next certificate 
rotation Mar 12 12:24:27.937611 master-0 kubenswrapper[13984]: I0312 12:24:27.937444 13984 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 12 12:24:27.937611 master-0 kubenswrapper[13984]: I0312 12:24:27.937551 13984 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 12 12:24:27.937731 master-0 kubenswrapper[13984]: E0312 12:24:27.937547 13984 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 12 12:24:27.938216 master-0 kubenswrapper[13984]: I0312 12:24:27.938177 13984 factory.go:55] Registering systemd factory Mar 12 12:24:27.938216 master-0 kubenswrapper[13984]: I0312 12:24:27.938206 13984 factory.go:221] Registration of the systemd container factory successfully Mar 12 12:24:27.938511 master-0 kubenswrapper[13984]: I0312 12:24:27.938490 13984 factory.go:153] Registering CRI-O factory Mar 12 12:24:27.938728 master-0 kubenswrapper[13984]: I0312 12:24:27.938509 13984 factory.go:221] Registration of the crio container factory successfully Mar 12 12:24:27.938805 master-0 kubenswrapper[13984]: I0312 12:24:27.938790 13984 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 12 12:24:27.938855 master-0 kubenswrapper[13984]: I0312 12:24:27.938813 13984 factory.go:103] Registering Raw factory Mar 12 12:24:27.938855 master-0 kubenswrapper[13984]: I0312 12:24:27.938828 13984 manager.go:1196] Started watching for new ooms in manager Mar 12 12:24:27.946979 master-0 kubenswrapper[13984]: I0312 12:24:27.943174 13984 manager.go:319] Starting recovery of all containers Mar 12 12:24:27.947591 master-0 kubenswrapper[13984]: I0312 12:24:27.947463 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81cb0504-9455-4398-aed1-5cc6790f292e" 
volumeName="kubernetes.io/projected/81cb0504-9455-4398-aed1-5cc6790f292e-kube-api-access-j5jgq" seLinuxMountContext="" Mar 12 12:24:27.947698 master-0 kubenswrapper[13984]: I0312 12:24:27.947680 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99a11fe6-48a1-439e-b788-158dbe267dcd" volumeName="kubernetes.io/secret/99a11fe6-48a1-439e-b788-158dbe267dcd-proxy-tls" seLinuxMountContext="" Mar 12 12:24:27.947779 master-0 kubenswrapper[13984]: I0312 12:24:27.947760 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa8ddfdd-7f2d-4fd4-b666-1497dee752df" volumeName="kubernetes.io/empty-dir/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-cache" seLinuxMountContext="" Mar 12 12:24:27.947851 master-0 kubenswrapper[13984]: I0312 12:24:27.947838 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1d16bbc-778b-4fc1-abb2-b43e79a7c532" volumeName="kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5" seLinuxMountContext="" Mar 12 12:24:27.947916 master-0 kubenswrapper[13984]: I0312 12:24:27.947905 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" volumeName="kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config" seLinuxMountContext="" Mar 12 12:24:27.947976 master-0 kubenswrapper[13984]: I0312 12:24:27.947965 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" volumeName="kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.948033 master-0 kubenswrapper[13984]: I0312 12:24:27.948020 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" 
volumeName="kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy" seLinuxMountContext="" Mar 12 12:24:27.948092 master-0 kubenswrapper[13984]: I0312 12:24:27.948080 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" volumeName="kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides" seLinuxMountContext="" Mar 12 12:24:27.948150 master-0 kubenswrapper[13984]: I0312 12:24:27.948139 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c02552c-a477-4c6c-8a45-2fdc758c084b" volumeName="kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk" seLinuxMountContext="" Mar 12 12:24:27.948211 master-0 kubenswrapper[13984]: I0312 12:24:27.948200 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" volumeName="kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config" seLinuxMountContext="" Mar 12 12:24:27.948270 master-0 kubenswrapper[13984]: I0312 12:24:27.948260 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5890f0c-cebe-4788-89f7-27568d875741" volumeName="kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz" seLinuxMountContext="" Mar 12 12:24:27.948326 master-0 kubenswrapper[13984]: I0312 12:24:27.948316 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645" seLinuxMountContext="" Mar 12 12:24:27.948386 master-0 kubenswrapper[13984]: I0312 12:24:27.948375 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="022dd526-0ea5-4224-9d2e-778ed4ef8a56" volumeName="kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-kube-api-access-pxrnf" seLinuxMountContext="" Mar 12 12:24:27.948448 master-0 kubenswrapper[13984]: I0312 12:24:27.948434 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="632651f7-6641-49d8-9c48-7f6ea5846538" volumeName="kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config" seLinuxMountContext="" Mar 12 12:24:27.948536 master-0 kubenswrapper[13984]: I0312 12:24:27.948521 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="632651f7-6641-49d8-9c48-7f6ea5846538" volumeName="kubernetes.io/secret/632651f7-6641-49d8-9c48-7f6ea5846538-cert" seLinuxMountContext="" Mar 12 12:24:27.948625 master-0 kubenswrapper[13984]: I0312 12:24:27.948609 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a22189f2-3f35-4ea6-9892-39a1b46637e2" volumeName="kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls" seLinuxMountContext="" Mar 12 12:24:27.948705 master-0 kubenswrapper[13984]: I0312 12:24:27.948687 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa8ddfdd-7f2d-4fd4-b666-1497dee752df" volumeName="kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-kube-api-access-6wj8x" seLinuxMountContext="" Mar 12 12:24:27.948780 master-0 kubenswrapper[13984]: I0312 12:24:27.948765 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bab9dba-235f-467c-9224-634cca9acbd2" volumeName="kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-config" seLinuxMountContext="" Mar 12 12:24:27.948864 master-0 kubenswrapper[13984]: I0312 12:24:27.948849 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="55bf535c-93ab-4870-a9d2-c02496d71ef0" volumeName="kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.948949 master-0 kubenswrapper[13984]: I0312 12:24:27.948933 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61ab511b-72e9-4fb9-b5de-770f49514369" volumeName="kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls" seLinuxMountContext="" Mar 12 12:24:27.949034 master-0 kubenswrapper[13984]: I0312 12:24:27.949019 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="720101f1-0833-45af-a5b7-4910ece2a589" volumeName="kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-encryption-config" seLinuxMountContext="" Mar 12 12:24:27.949115 master-0 kubenswrapper[13984]: I0312 12:24:27.949099 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" volumeName="kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-config" seLinuxMountContext="" Mar 12 12:24:27.949194 master-0 kubenswrapper[13984]: I0312 12:24:27.949175 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" volumeName="kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.949277 master-0 kubenswrapper[13984]: I0312 12:24:27.949261 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" volumeName="kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles" seLinuxMountContext="" Mar 12 12:24:27.949357 master-0 kubenswrapper[13984]: I0312 12:24:27.949342 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="269d77d9-815e-4324-8827-1ce429063ed1" volumeName="kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29" seLinuxMountContext="" Mar 12 12:24:27.949430 master-0 kubenswrapper[13984]: I0312 12:24:27.949415 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15bf86d9-62b3-4af8-b6f6-23131d712332" volumeName="kubernetes.io/configmap/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-cabundle" seLinuxMountContext="" Mar 12 12:24:27.949533 master-0 kubenswrapper[13984]: I0312 12:24:27.949516 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54612733-158f-4a92-a1bf-f4a8d653ffaf" volumeName="kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq" seLinuxMountContext="" Mar 12 12:24:27.949673 master-0 kubenswrapper[13984]: I0312 12:24:27.949629 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="720101f1-0833-45af-a5b7-4910ece2a589" volumeName="kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.949753 master-0 kubenswrapper[13984]: I0312 12:24:27.949738 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a154f648-b96d-449e-b0f5-ba32266000c2" volumeName="kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz" seLinuxMountContext="" Mar 12 12:24:27.949835 master-0 kubenswrapper[13984]: I0312 12:24:27.949818 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a22189f2-3f35-4ea6-9892-39a1b46637e2" volumeName="kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca" seLinuxMountContext="" Mar 12 12:24:27.949920 master-0 kubenswrapper[13984]: I0312 12:24:27.949905 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="022dd526-0ea5-4224-9d2e-778ed4ef8a56" volumeName="kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-ca-certs" seLinuxMountContext="" Mar 12 12:24:27.950012 master-0 kubenswrapper[13984]: I0312 12:24:27.949996 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides" seLinuxMountContext="" Mar 12 12:24:27.950089 master-0 kubenswrapper[13984]: I0312 12:24:27.950074 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68c57a64-f30c-4caf-89ef-08bd0d36833e" volumeName="kubernetes.io/empty-dir/68c57a64-f30c-4caf-89ef-08bd0d36833e-snapshots" seLinuxMountContext="" Mar 12 12:24:27.950180 master-0 kubenswrapper[13984]: I0312 12:24:27.950165 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74d06933-afab-43a3-a1d3-88a569178d34" volumeName="kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq" seLinuxMountContext="" Mar 12 12:24:27.950273 master-0 kubenswrapper[13984]: I0312 12:24:27.950258 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" volumeName="kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access" seLinuxMountContext="" Mar 12 12:24:27.950352 master-0 kubenswrapper[13984]: I0312 12:24:27.950337 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-image-import-ca" seLinuxMountContext="" Mar 12 12:24:27.950436 master-0 kubenswrapper[13984]: I0312 12:24:27.950421 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" volumeName="kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config" seLinuxMountContext="" Mar 12 12:24:27.950537 master-0 kubenswrapper[13984]: I0312 12:24:27.950516 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" volumeName="kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.950626 master-0 kubenswrapper[13984]: I0312 12:24:27.950611 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" volumeName="kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r" seLinuxMountContext="" Mar 12 12:24:27.950701 master-0 kubenswrapper[13984]: I0312 12:24:27.950688 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c02552c-a477-4c6c-8a45-2fdc758c084b" volumeName="kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics" seLinuxMountContext="" Mar 12 12:24:27.950778 master-0 kubenswrapper[13984]: I0312 12:24:27.950761 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" volumeName="kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access" seLinuxMountContext="" Mar 12 12:24:27.950940 master-0 kubenswrapper[13984]: I0312 12:24:27.950923 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" volumeName="kubernetes.io/projected/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-kube-api-access-94src" seLinuxMountContext="" Mar 12 12:24:27.951025 master-0 kubenswrapper[13984]: I0312 12:24:27.951010 13984 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" volumeName="kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap" seLinuxMountContext="" Mar 12 12:24:27.951106 master-0 kubenswrapper[13984]: I0312 12:24:27.951091 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bab9dba-235f-467c-9224-634cca9acbd2" volumeName="kubernetes.io/projected/2bab9dba-235f-467c-9224-634cca9acbd2-kube-api-access-cv9kn" seLinuxMountContext="" Mar 12 12:24:27.951188 master-0 kubenswrapper[13984]: I0312 12:24:27.951173 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config" seLinuxMountContext="" Mar 12 12:24:27.951327 master-0 kubenswrapper[13984]: I0312 12:24:27.951311 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="021b22e3-b4c5-426d-b761-181f1e54175d" volumeName="kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6" seLinuxMountContext="" Mar 12 12:24:27.951431 master-0 kubenswrapper[13984]: I0312 12:24:27.951417 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm" seLinuxMountContext="" Mar 12 12:24:27.951641 master-0 kubenswrapper[13984]: I0312 12:24:27.951607 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68c57a64-f30c-4caf-89ef-08bd0d36833e" volumeName="kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-trusted-ca-bundle" seLinuxMountContext="" Mar 12 12:24:27.951745 master-0 kubenswrapper[13984]: I0312 12:24:27.951726 13984 reconstruct.go:130] "Volume is marked 
as uncertain and added into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" volumeName="kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl" seLinuxMountContext="" Mar 12 12:24:27.951845 master-0 kubenswrapper[13984]: I0312 12:24:27.951830 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" volumeName="kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-images" seLinuxMountContext="" Mar 12 12:24:27.951968 master-0 kubenswrapper[13984]: I0312 12:24:27.951932 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea80247e-b4dd-45dc-8255-6e68508c8480" volumeName="kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config" seLinuxMountContext="" Mar 12 12:24:27.952072 master-0 kubenswrapper[13984]: I0312 12:24:27.952056 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a121d0d-d201-446b-97a1-e2414e599f4a" volumeName="kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.952188 master-0 kubenswrapper[13984]: I0312 12:24:27.952171 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" volumeName="kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4" seLinuxMountContext="" Mar 12 12:24:27.952268 master-0 kubenswrapper[13984]: I0312 12:24:27.952253 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d47f860-d64a-49b8-b404-a67cbc2faeb6" volumeName="kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952328 13984 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="b5890f0c-cebe-4788-89f7-27568d875741" volumeName="kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952345 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="02d5a507-4409-44b4-98bc-1751cdcc6c6a" volumeName="kubernetes.io/projected/02d5a507-4409-44b4-98bc-1751cdcc6c6a-kube-api-access-jd4jz" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952357 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55bf535c-93ab-4870-a9d2-c02496d71ef0" volumeName="kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952369 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" volumeName="kubernetes.io/projected/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-kube-api-access-tb249" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952381 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" volumeName="kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952393 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952403 13984 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="ae2269d7-f11f-46d1-95e7-f89a70ee1152" volumeName="kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952413 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f19c3c89-8d32-4394-bd86-e5ef7734c42b" volumeName="kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952426 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3f295ac-7bc7-43b7-bd30-db82e7f16cd7" volumeName="kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952437 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" volumeName="kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952449 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="632651f7-6641-49d8-9c48-7f6ea5846538" volumeName="kubernetes.io/projected/632651f7-6641-49d8-9c48-7f6ea5846538-kube-api-access-86k82" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952462 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68c57a64-f30c-4caf-89ef-08bd0d36833e" volumeName="kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-service-ca-bundle" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952492 13984 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952505 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" volumeName="kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952516 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="580bafd6-af8c-4961-b959-b736a180e309" volumeName="kubernetes.io/projected/580bafd6-af8c-4961-b959-b736a180e309-kube-api-access-lf74f" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952528 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d47f860-d64a-49b8-b404-a67cbc2faeb6" volumeName="kubernetes.io/projected/9d47f860-d64a-49b8-b404-a67cbc2faeb6-kube-api-access-njx9l" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952566 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9194868-75ce-4138-a9d4-ddd64660c529" volumeName="kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952579 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" volumeName="kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952594 13984 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="720101f1-0833-45af-a5b7-4910ece2a589" volumeName="kubernetes.io/projected/720101f1-0833-45af-a5b7-4910ece2a589-kube-api-access-dcb5s" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952607 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9bc7dea3-1868-488c-a34b-288cde3acd35" volumeName="kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952621 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9bc7dea3-1868-488c-a34b-288cde3acd35" volumeName="kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952634 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952645 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5890f0c-cebe-4788-89f7-27568d875741" volumeName="kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952657 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfd178d7-f518-413b-95ab-ab6687be6e0f" volumeName="kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952670 13984 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="ed5a074c-e194-4b16-a4c9-0d82830bf7ca" volumeName="kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-tuned" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952682 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="02d5a507-4409-44b4-98bc-1751cdcc6c6a" volumeName="kubernetes.io/configmap/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cco-trusted-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952693 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3b704e7-1291-4645-8a0d-2a937829d7ac" volumeName="kubernetes.io/projected/f3b704e7-1291-4645-8a0d-2a937829d7ac-kube-api-access-zct7x" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952708 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" volumeName="kubernetes.io/secret/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-machine-approver-tls" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952719 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952731 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" volumeName="kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952742 13984 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="f3b704e7-1291-4645-8a0d-2a937829d7ac" volumeName="kubernetes.io/secret/f3b704e7-1291-4645-8a0d-2a937829d7ac-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952754 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="021b22e3-b4c5-426d-b761-181f1e54175d" volumeName="kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952766 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="aa8ddfdd-7f2d-4fd4-b666-1497dee752df" volumeName="kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-ca-certs" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952779 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-client" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952791 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" volumeName="kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952802 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d1d16bbc-778b-4fc1-abb2-b43e79a7c532" volumeName="kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952813 13984 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="d961a5f0-84b7-47d7-846b-238475947121" volumeName="kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952825 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e64bc838-280e-4231-9732-1adb69fed0bc" volumeName="kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952837 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e64bc838-280e-4231-9732-1adb69fed0bc" volumeName="kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952849 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="021b22e3-b4c5-426d-b761-181f1e54175d" volumeName="kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952861 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ed5a074c-e194-4b16-a4c9-0d82830bf7ca" volumeName="kubernetes.io/projected/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-kube-api-access-qhwbb" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952872 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" volumeName="kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-auth-proxy-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952882 13984 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" volumeName="kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952898 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68c57a64-f30c-4caf-89ef-08bd0d36833e" volumeName="kubernetes.io/secret/68c57a64-f30c-4caf-89ef-08bd0d36833e-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952909 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="720101f1-0833-45af-a5b7-4910ece2a589" volumeName="kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-trusted-ca-bundle" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952920 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="10498208-0692-4533-b672-a7a2cfcdf1be" volumeName="kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952931 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="114b1d16-b37d-449c-84e3-3fb3f8b20eaa" volumeName="kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952941 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15bf86d9-62b3-4af8-b6f6-23131d712332" volumeName="kubernetes.io/secret/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-key" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952953 13984 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" volumeName="kubernetes.io/projected/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef-kube-api-access-wvrpf" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952965 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae2269d7-f11f-46d1-95e7-f89a70ee1152" volumeName="kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952980 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-trusted-ca-bundle" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.952992 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" volumeName="kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953005 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ed5a074c-e194-4b16-a4c9-0d82830bf7ca" volumeName="kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-tmp" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953017 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="02d5a507-4409-44b4-98bc-1751cdcc6c6a" volumeName="kubernetes.io/secret/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cloud-credential-operator-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953030 13984 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953043 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="022dd526-0ea5-4224-9d2e-778ed4ef8a56" volumeName="kubernetes.io/empty-dir/022dd526-0ea5-4224-9d2e-778ed4ef8a56-cache" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953055 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="720101f1-0833-45af-a5b7-4910ece2a589" volumeName="kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-audit-policies" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953067 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953078 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="022dd526-0ea5-4224-9d2e-778ed4ef8a56" volumeName="kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953092 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953103 13984 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="666857a1-0ddf-4b48-91f4-44cce154d1b1" volumeName="kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953115 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfd178d7-f518-413b-95ab-ab6687be6e0f" volumeName="kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953127 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="36852fda-6aee-4a36-8724-537f1260c4c8" volumeName="kubernetes.io/projected/36852fda-6aee-4a36-8724-537f1260c4c8-kube-api-access-54547" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953138 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99a11fe6-48a1-439e-b788-158dbe267dcd" volumeName="kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-auth-proxy-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953149 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a154f648-b96d-449e-b0f5-ba32266000c2" volumeName="kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953160 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953172 13984 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-serving-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953183 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-encryption-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953195 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81cb0504-9455-4398-aed1-5cc6790f292e" volumeName="kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-images" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953207 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="51d58450-50bb-4da0-b1f6-4135fbabd856" volumeName="kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953218 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a154f648-b96d-449e-b0f5-ba32266000c2" volumeName="kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953229 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953241 13984 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953252 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953262 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a346ac54-02fe-417f-a49d-038e45b13a1d" volumeName="kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953275 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="114b1d16-b37d-449c-84e3-3fb3f8b20eaa" volumeName="kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953286 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="54612733-158f-4a92-a1bf-f4a8d653ffaf" volumeName="kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953297 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5a012d0b-d1a8-4cd3-8b91-b346d0445f24" volumeName="kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953308 13984 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61ab511b-72e9-4fb9-b5de-770f49514369" volumeName="kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953322 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" volumeName="kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953333 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666857a1-0ddf-4b48-91f4-44cce154d1b1" volumeName="kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953344 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666857a1-0ddf-4b48-91f4-44cce154d1b1" volumeName="kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953356 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81cb0504-9455-4398-aed1-5cc6790f292e" volumeName="kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-auth-proxy-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953370 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bab9dba-235f-467c-9224-634cca9acbd2" volumeName="kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-images" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953382 13984 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9194868-75ce-4138-a9d4-ddd64660c529" volumeName="kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953393 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c62edaec-38e2-4b73-8bb5-c776abfb310f" volumeName="kubernetes.io/projected/c62edaec-38e2-4b73-8bb5-c776abfb310f-kube-api-access-cdfnh" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953404 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea80247e-b4dd-45dc-8255-6e68508c8480" volumeName="kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953415 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f3f295ac-7bc7-43b7-bd30-db82e7f16cd7" volumeName="kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953429 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99a11fe6-48a1-439e-b788-158dbe267dcd" volumeName="kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-images" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953442 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2bab9dba-235f-467c-9224-634cca9acbd2" volumeName="kubernetes.io/secret/2bab9dba-235f-467c-9224-634cca9acbd2-machine-api-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953453 13984 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3c02552c-a477-4c6c-8a45-2fdc758c084b" volumeName="kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953465 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="55bf535c-93ab-4870-a9d2-c02496d71ef0" volumeName="kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953489 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d47f860-d64a-49b8-b404-a67cbc2faeb6" volumeName="kubernetes.io/secret/9d47f860-d64a-49b8-b404-a67cbc2faeb6-metrics-tls" seLinuxMountContext="" Mar 12 12:24:27.953525 master-0 kubenswrapper[13984]: I0312 12:24:27.953501 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954680 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d961a5f0-84b7-47d7-846b-238475947121" volumeName="kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954760 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ea80247e-b4dd-45dc-8255-6e68508c8480" volumeName="kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954778 13984 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="021b22e3-b4c5-426d-b761-181f1e54175d" volumeName="kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954803 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2f3a291a-d9af-4e0f-a307-8928e4dc523d" volumeName="kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954819 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" volumeName="kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954845 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954860 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15bf86d9-62b3-4af8-b6f6-23131d712332" volumeName="kubernetes.io/projected/15bf86d9-62b3-4af8-b6f6-23131d712332-kube-api-access-g7tct" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954875 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a121d0d-d201-446b-97a1-e2414e599f4a" volumeName="kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954895 13984 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a22189f2-3f35-4ea6-9892-39a1b46637e2" volumeName="kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954910 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ae2269d7-f11f-46d1-95e7-f89a70ee1152" volumeName="kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954929 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9194868-75ce-4138-a9d4-ddd64660c529" volumeName="kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954942 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="720101f1-0833-45af-a5b7-4910ece2a589" volumeName="kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-etcd-client" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954956 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="81cb0504-9455-4398-aed1-5cc6790f292e" volumeName="kubernetes.io/secret/81cb0504-9455-4398-aed1-5cc6790f292e-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.954976 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8a121d0d-d201-446b-97a1-e2414e599f4a" volumeName="kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 
12:24:27.954992 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="99a11fe6-48a1-439e-b788-158dbe267dcd" volumeName="kubernetes.io/projected/99a11fe6-48a1-439e-b788-158dbe267dcd-kube-api-access-ns72p" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955007 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b9194868-75ce-4138-a9d4-ddd64660c529" volumeName="kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955025 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c62edaec-38e2-4b73-8bb5-c776abfb310f" volumeName="kubernetes.io/secret/c62edaec-38e2-4b73-8bb5-c776abfb310f-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955042 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/projected/c873b656-d2aa-4d0e-aa22-9f8d35186473-kube-api-access-kgxbv" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955060 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f19c3c89-8d32-4394-bd86-e5ef7734c42b" volumeName="kubernetes.io/projected/f19c3c89-8d32-4394-bd86-e5ef7734c42b-kube-api-access-s28gq" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955074 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="74d06933-afab-43a3-a1d3-88a569178d34" volumeName="kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs" seLinuxMountContext="" Mar 12 12:24:27.957235 
master-0 kubenswrapper[13984]: I0312 12:24:27.955087 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="68c57a64-f30c-4caf-89ef-08bd0d36833e" volumeName="kubernetes.io/projected/68c57a64-f30c-4caf-89ef-08bd0d36833e-kube-api-access-x4nbb" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955106 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a22189f2-3f35-4ea6-9892-39a1b46637e2" volumeName="kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955120 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab087440-bdf2-4e2f-9a5a-434d50a2329a" volumeName="kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955139 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" volumeName="kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955151 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="114b1d16-b37d-449c-84e3-3fb3f8b20eaa" volumeName="kubernetes.io/projected/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-kube-api-access" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955165 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" volumeName="kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.957235 
master-0 kubenswrapper[13984]: I0312 12:24:27.955182 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-config" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955199 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c873b656-d2aa-4d0e-aa22-9f8d35186473" volumeName="kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-serving-cert" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955217 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfd178d7-f518-413b-95ab-ab6687be6e0f" volumeName="kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955230 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cfd178d7-f518-413b-95ab-ab6687be6e0f" volumeName="kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955243 13984 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="720101f1-0833-45af-a5b7-4910ece2a589" volumeName="kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-etcd-serving-ca" seLinuxMountContext="" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955254 13984 reconstruct.go:97] "Volume reconstruction finished" Mar 12 12:24:27.957235 master-0 kubenswrapper[13984]: I0312 12:24:27.955269 13984 reconciler.go:26] "Reconciler: start to sync state" Mar 12 12:24:27.974124 master-0 kubenswrapper[13984]: I0312 12:24:27.974042 13984 
kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 12 12:24:27.978636 master-0 kubenswrapper[13984]: I0312 12:24:27.978608 13984 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 12 12:24:27.978728 master-0 kubenswrapper[13984]: I0312 12:24:27.978664 13984 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 12 12:24:27.978728 master-0 kubenswrapper[13984]: I0312 12:24:27.978685 13984 kubelet.go:2335] "Starting kubelet main sync loop" Mar 12 12:24:27.979106 master-0 kubenswrapper[13984]: E0312 12:24:27.979070 13984 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 12 12:24:27.980312 master-0 kubenswrapper[13984]: I0312 12:24:27.980254 13984 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 12:24:27.988936 master-0 kubenswrapper[13984]: I0312 12:24:27.988845 13984 generic.go:334] "Generic (PLEG): container finished" podID="68c57a64-f30c-4caf-89ef-08bd0d36833e" containerID="66efbdc582547fe0aac5943aa60889d7bd9e3cd56c005e2ffa377552f8953df8" exitCode=0 Mar 12 12:24:27.989093 master-0 kubenswrapper[13984]: I0312 12:24:27.989057 13984 generic.go:334] "Generic (PLEG): container finished" podID="68c57a64-f30c-4caf-89ef-08bd0d36833e" containerID="60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af" exitCode=0 Mar 12 12:24:27.993087 master-0 kubenswrapper[13984]: I0312 12:24:27.993057 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-vpss8_a22189f2-3f35-4ea6-9892-39a1b46637e2/ingress-operator/0.log" Mar 12 12:24:27.993161 master-0 kubenswrapper[13984]: I0312 12:24:27.993105 13984 generic.go:334] "Generic (PLEG): container finished" podID="a22189f2-3f35-4ea6-9892-39a1b46637e2" 
containerID="97f79ecdfa3c97644b3ca23d2c5dae1dd9db4d81745183dd21308d5c06844fd7" exitCode=1 Mar 12 12:24:28.003776 master-0 kubenswrapper[13984]: I0312 12:24:28.003737 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-rbb5m_61ab511b-72e9-4fb9-b5de-770f49514369/network-operator/0.log" Mar 12 12:24:28.003915 master-0 kubenswrapper[13984]: I0312 12:24:28.003786 13984 generic.go:334] "Generic (PLEG): container finished" podID="61ab511b-72e9-4fb9-b5de-770f49514369" containerID="630c088f3826f86c1fe389a213d79d0dfdd3c10669dd76b8de7210253f979c04" exitCode=255 Mar 12 12:24:28.007719 master-0 kubenswrapper[13984]: I0312 12:24:28.007665 13984 generic.go:334] "Generic (PLEG): container finished" podID="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" containerID="d357ccc688b993b9454b28bfd7fb28a5d58ecf020cbf9839477bf958a0d7b96f" exitCode=0 Mar 12 12:24:28.010883 master-0 kubenswrapper[13984]: I0312 12:24:28.010853 13984 generic.go:334] "Generic (PLEG): container finished" podID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerID="b315bdbdd00e7e78faaebae53e6e5aca4dcfbe013781ad0113a093ac0097dc1b" exitCode=1 Mar 12 12:24:28.013406 master-0 kubenswrapper[13984]: I0312 12:24:28.013371 13984 generic.go:334] "Generic (PLEG): container finished" podID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerID="c6ebc8cdb1ea535bf07be2b08b9bb0f2c20d1bfada4f7f7c593c77044d94f79a" exitCode=0 Mar 12 12:24:28.016170 master-0 kubenswrapper[13984]: I0312 12:24:28.016131 13984 generic.go:334] "Generic (PLEG): container finished" podID="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" containerID="8ee43439e03e174fce129de95caef5dbf8392a0dcca1c8da1e1088570ad3efed" exitCode=0 Mar 12 12:24:28.016170 master-0 kubenswrapper[13984]: I0312 12:24:28.016166 13984 generic.go:334] "Generic (PLEG): container finished" podID="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" containerID="5c4da145a949d5e5c70a918c24d5fac3234eb8326b6c012c2dbb76b6f559a57f" exitCode=0 Mar 12 12:24:28.016290 
master-0 kubenswrapper[13984]: I0312 12:24:28.016178 13984 generic.go:334] "Generic (PLEG): container finished" podID="3ebe5b05-95d6-43ff-95a4-0c9c7ce70326" containerID="84b74517ff1cd7d9b9c2c7332d3d21e7e4c1bd81a8eed9a021787271d5de1935" exitCode=0 Mar 12 12:24:28.027337 master-0 kubenswrapper[13984]: I0312 12:24:28.027291 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-sp7w9_9bc7dea3-1868-488c-a34b-288cde3acd35/olm-operator/0.log" Mar 12 12:24:28.027337 master-0 kubenswrapper[13984]: I0312 12:24:28.027338 13984 generic.go:334] "Generic (PLEG): container finished" podID="9bc7dea3-1868-488c-a34b-288cde3acd35" containerID="66ac5cb5ac01ba22dc5debb8648c5e910f8e7732cb1f9e6097ebc2965cc2ccbc" exitCode=1 Mar 12 12:24:28.029358 master-0 kubenswrapper[13984]: I0312 12:24:28.029313 13984 generic.go:334] "Generic (PLEG): container finished" podID="a346ac54-02fe-417f-a49d-038e45b13a1d" containerID="ebdafa22bf6b7ed28319fcbd34230d3e124233b075083adefec56e18e5a788b3" exitCode=0 Mar 12 12:24:28.033877 master-0 kubenswrapper[13984]: I0312 12:24:28.033825 13984 generic.go:334] "Generic (PLEG): container finished" podID="55bf535c-93ab-4870-a9d2-c02496d71ef0" containerID="6a3b8be971ca63800f0532603d6fc3d806dc294fa7565d4894a90520eb420540" exitCode=0 Mar 12 12:24:28.045686 master-0 kubenswrapper[13984]: E0312 12:24:28.040880 13984 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 12 12:24:28.047877 master-0 kubenswrapper[13984]: I0312 12:24:28.047830 13984 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="609bc41bd5850647fabc2e01d12345f9b41d6cc4ea84bcb7679ae4b6d13d442e" exitCode=0 Mar 12 12:24:28.047877 master-0 kubenswrapper[13984]: I0312 12:24:28.047869 13984 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" 
containerID="fb3151f0498b4d271395613cdb5a66c2bdbd18c371f71b100988d9a1524ba2df" exitCode=0 Mar 12 12:24:28.047972 master-0 kubenswrapper[13984]: I0312 12:24:28.047881 13984 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="ea68604b446ee8cffe24318b7151377c7b04f157b8ea561b83368baecd127158" exitCode=0 Mar 12 12:24:28.047972 master-0 kubenswrapper[13984]: I0312 12:24:28.047891 13984 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="ef1ee7f5f63359043faa6c46c732e5167be16d3211712dbb5e513926f5b91304" exitCode=0 Mar 12 12:24:28.047972 master-0 kubenswrapper[13984]: I0312 12:24:28.047901 13984 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="9b70d60ef3db50f988d7297005c87ae9142093113f8ee25c0d2a4d1f3023050e" exitCode=0 Mar 12 12:24:28.047972 master-0 kubenswrapper[13984]: I0312 12:24:28.047910 13984 generic.go:334] "Generic (PLEG): container finished" podID="10498208-0692-4533-b672-a7a2cfcdf1be" containerID="05ebbc8f6ffb604ea4cb658572a6553a226fea47b2132dba48a9b8a612eeb8a1" exitCode=0 Mar 12 12:24:28.062597 master-0 kubenswrapper[13984]: I0312 12:24:28.062555 13984 generic.go:334] "Generic (PLEG): container finished" podID="6571f5e5-07ee-4e6c-a8ad-277bc52e35ee" containerID="78cb426b98a54442332ae7dea069dbb75e6d07a8377b812d61f3d58bf1a33d17" exitCode=0 Mar 12 12:24:28.069661 master-0 kubenswrapper[13984]: I0312 12:24:28.069617 13984 generic.go:334] "Generic (PLEG): container finished" podID="8a121d0d-d201-446b-97a1-e2414e599f4a" containerID="13d86bfb78c4cbeb08e1a2822a58d3b19158c64d8933a782a2465848cf9de135" exitCode=0 Mar 12 12:24:28.073141 master-0 kubenswrapper[13984]: I0312 12:24:28.073103 13984 generic.go:334] "Generic (PLEG): container finished" podID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerID="d6ff10b313edaeab618ba4bd948891faf24292c7fee20c8b60ece8104bb06b3a" exitCode=0 Mar 12 
12:24:28.078263 master-0 kubenswrapper[13984]: I0312 12:24:28.078236 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/1.log" Mar 12 12:24:28.079151 master-0 kubenswrapper[13984]: E0312 12:24:28.079128 13984 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 12 12:24:28.080572 master-0 kubenswrapper[13984]: I0312 12:24:28.080550 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/0.log" Mar 12 12:24:28.080649 master-0 kubenswrapper[13984]: I0312 12:24:28.080579 13984 generic.go:334] "Generic (PLEG): container finished" podID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" containerID="06f4c15730d5d23bfb91ec1bbc7bab14f9a3a3ae32c22b935d487a1f88576da3" exitCode=1 Mar 12 12:24:28.080649 master-0 kubenswrapper[13984]: I0312 12:24:28.080596 13984 generic.go:334] "Generic (PLEG): container finished" podID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" containerID="8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14" exitCode=1 Mar 12 12:24:28.082804 master-0 kubenswrapper[13984]: I0312 12:24:28.082468 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-rzmhl_51d58450-50bb-4da0-b1f6-4135fbabd856/approver/0.log" Mar 12 12:24:28.082804 master-0 kubenswrapper[13984]: I0312 12:24:28.082696 13984 generic.go:334] "Generic (PLEG): container finished" podID="51d58450-50bb-4da0-b1f6-4135fbabd856" containerID="2c2b5cd50e4b41a7c3aafd02e56e622ce6b2150721ba8e3603b831c988a04475" exitCode=1 Mar 12 12:24:28.090118 master-0 kubenswrapper[13984]: I0312 12:24:28.090075 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-cg7rd_b5890f0c-cebe-4788-89f7-27568d875741/openshift-controller-manager-operator/0.log" Mar 12 12:24:28.090261 master-0 kubenswrapper[13984]: I0312 12:24:28.090127 13984 generic.go:334] "Generic (PLEG): container finished" podID="b5890f0c-cebe-4788-89f7-27568d875741" containerID="54c0c581483d3deef8c82f62f09c7eb9259f3f30693873246ec5154f0dcb5178" exitCode=1 Mar 12 12:24:28.091699 master-0 kubenswrapper[13984]: I0312 12:24:28.091670 13984 generic.go:334] "Generic (PLEG): container finished" podID="9b960fe2-d59e-4ee1-bd9d-455b46753cb9" containerID="2ecd7f48de11aae6e5506fd79aa229be8956b481497e7ee996afbf26849c14c9" exitCode=0 Mar 12 12:24:28.094188 master-0 kubenswrapper[13984]: I0312 12:24:28.094137 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-nwk7v_d961a5f0-84b7-47d7-846b-238475947121/catalog-operator/0.log" Mar 12 12:24:28.094188 master-0 kubenswrapper[13984]: I0312 12:24:28.094182 13984 generic.go:334] "Generic (PLEG): container finished" podID="d961a5f0-84b7-47d7-846b-238475947121" containerID="500e92ed2e084af2e7e87754fde25e08ea680d95e255a969f1e5b249d9b80765" exitCode=1 Mar 12 12:24:28.096987 master-0 kubenswrapper[13984]: I0312 12:24:28.096934 13984 generic.go:334] "Generic (PLEG): container finished" podID="a154f648-b96d-449e-b0f5-ba32266000c2" containerID="962d17c59c83032444283250e4b6771dc0674c33ed005bbb62fede3e00c87666" exitCode=0 Mar 12 12:24:28.096987 master-0 kubenswrapper[13984]: I0312 12:24:28.096979 13984 generic.go:334] "Generic (PLEG): container finished" podID="a154f648-b96d-449e-b0f5-ba32266000c2" containerID="4200e813be631eda72870daef6f5fca8205e93c466ebc5b8aa512a46303fa669" exitCode=0 Mar 12 12:24:28.107001 master-0 kubenswrapper[13984]: I0312 12:24:28.106952 13984 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" 
containerID="f0f3c0b6faeeb349351f1b371244c103cf179f81f4ae7b1577ee82387a636818" exitCode=0 Mar 12 12:24:28.109835 master-0 kubenswrapper[13984]: I0312 12:24:28.109790 13984 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d" exitCode=1 Mar 12 12:24:28.117245 master-0 kubenswrapper[13984]: I0312 12:24:28.117223 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/1.log" Mar 12 12:24:28.117765 master-0 kubenswrapper[13984]: I0312 12:24:28.117735 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/0.log" Mar 12 12:24:28.118086 master-0 kubenswrapper[13984]: I0312 12:24:28.118059 13984 generic.go:334] "Generic (PLEG): container finished" podID="632651f7-6641-49d8-9c48-7f6ea5846538" containerID="c119461954016ba56cd4650cbf6e8e2b03da1364b0a52ff3ba4437048b8fac29" exitCode=255 Mar 12 12:24:28.118086 master-0 kubenswrapper[13984]: I0312 12:24:28.118083 13984 generic.go:334] "Generic (PLEG): container finished" podID="632651f7-6641-49d8-9c48-7f6ea5846538" containerID="a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb" exitCode=255 Mar 12 12:24:28.119737 master-0 kubenswrapper[13984]: I0312 12:24:28.119715 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_b7aa62dd-2de4-4511-a7e7-27f45fe97cc1/installer/0.log" Mar 12 12:24:28.119818 master-0 kubenswrapper[13984]: I0312 12:24:28.119784 13984 generic.go:334] "Generic (PLEG): container finished" podID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerID="1d7fe59666c56c328d192c878aec970270ae5b794f380892a172dc12ef6839ec" exitCode=1 Mar 12 12:24:28.123855 
master-0 kubenswrapper[13984]: I0312 12:24:28.123823 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/0.log"
Mar 12 12:24:28.124266 master-0 kubenswrapper[13984]: I0312 12:24:28.124236 13984 generic.go:334] "Generic (PLEG): container finished" podID="81cb0504-9455-4398-aed1-5cc6790f292e" containerID="db3a8ef068f83e5dd24c99272c185086b8f1f58c8b82fffeba41265fdc76efe5" exitCode=1
Mar 12 12:24:28.125990 master-0 kubenswrapper[13984]: I0312 12:24:28.125944 13984 generic.go:334] "Generic (PLEG): container finished" podID="720101f1-0833-45af-a5b7-4910ece2a589" containerID="f1911a04d932c171347c3c6692dcdee354bd0734087d8540b5936ed3d825696a" exitCode=0
Mar 12 12:24:28.127722 master-0 kubenswrapper[13984]: I0312 12:24:28.127696 13984 generic.go:334] "Generic (PLEG): container finished" podID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" containerID="85782e605fa997f78fa1b00da3fcf4b854ebddeada29c56b1d38c44587c26563" exitCode=0
Mar 12 12:24:28.128929 master-0 kubenswrapper[13984]: I0312 12:24:28.128890 13984 generic.go:334] "Generic (PLEG): container finished" podID="a59a6bb7-f966-4208-ba85-452095404891" containerID="b0b0a71bb15ee38a2037cc0d67a425037c9a862e431396ce17c0501ae76f6aae" exitCode=1
Mar 12 12:24:28.136276 master-0 kubenswrapper[13984]: I0312 12:24:28.136242 13984 generic.go:334] "Generic (PLEG): container finished" podID="f04121eb-5c7b-42cd-a2e2-26cf1c67593d" containerID="0f612f745cd190b544d29d6db2a18fd47dfd753a5c71d5de2a3d0a726aefe224" exitCode=0
Mar 12 12:24:28.143960 master-0 kubenswrapper[13984]: I0312 12:24:28.143044 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 12 12:24:28.143960 master-0 kubenswrapper[13984]: I0312 12:24:28.143493 13984 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27" exitCode=1
Mar 12 12:24:28.143960 master-0 kubenswrapper[13984]: I0312 12:24:28.143532 13984 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="db3181f1b8f0872f0e0be3a238121d5924cf03894cfb4adac07406dd2be14404" exitCode=0
Mar 12 12:24:28.146344 master-0 kubenswrapper[13984]: I0312 12:24:28.145461 13984 generic.go:334] "Generic (PLEG): container finished" podID="ab087440-bdf2-4e2f-9a5a-434d50a2329a" containerID="c10ad00e6d5ca94dd8aee1068bcae9a35fd5744bbc9fa9703850b00e0063db31" exitCode=0
Mar 12 12:24:28.146344 master-0 kubenswrapper[13984]: E0312 12:24:28.145818 13984 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 12 12:24:28.148374 master-0 kubenswrapper[13984]: I0312 12:24:28.148318 13984 generic.go:334] "Generic (PLEG): container finished" podID="c873b656-d2aa-4d0e-aa22-9f8d35186473" containerID="cff5fae1b95c8c229432e602087c75f134e2f63191ff0620438ff838fbf87945" exitCode=0
Mar 12 12:24:28.153244 master-0 kubenswrapper[13984]: I0312 12:24:28.153191 13984 generic.go:334] "Generic (PLEG): container finished" podID="ea80247e-b4dd-45dc-8255-6e68508c8480" containerID="fc4b4e674883cb3e17ae8f6229f477c4fc095a1d76196bd6eee19ed1d8bb25c9" exitCode=0
Mar 12 12:24:28.158680 master-0 kubenswrapper[13984]: I0312 12:24:28.158620 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="d7e05b92a4dbd9fdd6089f7478db7952000d8da47fb29fa9de9acabcf994c90c" exitCode=0
Mar 12 12:24:28.158680 master-0 kubenswrapper[13984]: I0312 12:24:28.158655 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="17df0049e355b3a960768281cd9fb4fe90537eac08f31c82188b349d802deef8" exitCode=0
Mar 12 12:24:28.158680 master-0 kubenswrapper[13984]: I0312 12:24:28.158664 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="c768108f96dad44b9e3bcf8d0d6db5eb9d2c1ac1b865d93bd4ff9f67e7bb635a" exitCode=0
Mar 12 12:24:28.160634 master-0 kubenswrapper[13984]: I0312 12:24:28.160599 13984 generic.go:334] "Generic (PLEG): container finished" podID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerID="823aef6fae9a1e0ac9ce3e87b09c4b094495f36691d261ce34a1a0d40c54755e" exitCode=0
Mar 12 12:24:28.162390 master-0 kubenswrapper[13984]: I0312 12:24:28.162340 13984 generic.go:334] "Generic (PLEG): container finished" podID="a97fcd56-aa52-414a-b370-154c1b34c1ed" containerID="9e58c221e9a2e73d89eb52ff2e2377c97caf0ea7574d33f3dc1598a292639881" exitCode=0
Mar 12 12:24:28.245971 master-0 kubenswrapper[13984]: E0312 12:24:28.245861 13984 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 12 12:24:28.273962 master-0 kubenswrapper[13984]: I0312 12:24:28.273880 13984 manager.go:324] Recovery completed
Mar 12 12:24:28.279272 master-0 kubenswrapper[13984]: E0312 12:24:28.279222 13984 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 12 12:24:28.343319 master-0 kubenswrapper[13984]: I0312 12:24:28.343276 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.345052 master-0 kubenswrapper[13984]: I0312 12:24:28.345029 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.345185 master-0 kubenswrapper[13984]: I0312 12:24:28.345171 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.345278 master-0 kubenswrapper[13984]: I0312 12:24:28.345265 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.346208 master-0 kubenswrapper[13984]: E0312 12:24:28.346183 13984 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 12 12:24:28.349264 master-0 kubenswrapper[13984]: I0312 12:24:28.349213 13984 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 12 12:24:28.349264 master-0 kubenswrapper[13984]: I0312 12:24:28.349261 13984 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 12 12:24:28.349357 master-0 kubenswrapper[13984]: I0312 12:24:28.349308 13984 state_mem.go:36] "Initialized new in-memory state store"
Mar 12 12:24:28.349671 master-0 kubenswrapper[13984]: I0312 12:24:28.349634 13984 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 12 12:24:28.349705 master-0 kubenswrapper[13984]: I0312 12:24:28.349670 13984 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 12 12:24:28.349746 master-0 kubenswrapper[13984]: I0312 12:24:28.349707 13984 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 12 12:24:28.349746 master-0 kubenswrapper[13984]: I0312 12:24:28.349721 13984 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 12 12:24:28.349746 master-0 kubenswrapper[13984]: I0312 12:24:28.349734 13984 policy_none.go:49] "None policy: Start"
Mar 12 12:24:28.353648 master-0 kubenswrapper[13984]: I0312 12:24:28.353630 13984 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 12 12:24:28.353731 master-0 kubenswrapper[13984]: I0312 12:24:28.353722 13984 state_mem.go:35] "Initializing new in-memory state store"
Mar 12 12:24:28.354026 master-0 kubenswrapper[13984]: I0312 12:24:28.354014 13984 state_mem.go:75] "Updated machine memory state"
Mar 12 12:24:28.354125 master-0 kubenswrapper[13984]: I0312 12:24:28.354116 13984 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 12 12:24:28.368819 master-0 kubenswrapper[13984]: I0312 12:24:28.368781 13984 manager.go:334] "Starting Device Plugin manager"
Mar 12 12:24:28.368966 master-0 kubenswrapper[13984]: I0312 12:24:28.368832 13984 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 12 12:24:28.368966 master-0 kubenswrapper[13984]: I0312 12:24:28.368847 13984 server.go:79] "Starting device plugin registration server"
Mar 12 12:24:28.369283 master-0 kubenswrapper[13984]: I0312 12:24:28.369262 13984 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 12 12:24:28.369319 master-0 kubenswrapper[13984]: I0312 12:24:28.369279 13984 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 12 12:24:28.369568 master-0 kubenswrapper[13984]: I0312 12:24:28.369438 13984 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 12 12:24:28.369985 master-0 kubenswrapper[13984]: I0312 12:24:28.369955 13984 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 12 12:24:28.370058 master-0 kubenswrapper[13984]: I0312 12:24:28.369992 13984 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 12 12:24:28.375862 master-0 kubenswrapper[13984]: E0312 12:24:28.375814 13984 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 12 12:24:28.470035 master-0 kubenswrapper[13984]: I0312 12:24:28.469928 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.472647 master-0 kubenswrapper[13984]: I0312 12:24:28.472573 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.472647 master-0 kubenswrapper[13984]: I0312 12:24:28.472619 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.472647 master-0 kubenswrapper[13984]: I0312 12:24:28.472631 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.472647 master-0 kubenswrapper[13984]: I0312 12:24:28.472658 13984 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:24:28.476394 master-0 kubenswrapper[13984]: E0312 12:24:28.476340 13984 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0"
Mar 12 12:24:28.677391 master-0 kubenswrapper[13984]: I0312 12:24:28.677236 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.679587 master-0 kubenswrapper[13984]: I0312 12:24:28.679517 13984 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-controller-manager/kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"]
Mar 12 12:24:28.679730 master-0 kubenswrapper[13984]: I0312 12:24:28.679634 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.679857 master-0 kubenswrapper[13984]: I0312 12:24:28.679766 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.679857 master-0 kubenswrapper[13984]: I0312 12:24:28.679808 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.679857 master-0 kubenswrapper[13984]: I0312 12:24:28.679821 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.679857 master-0 kubenswrapper[13984]: I0312 12:24:28.679843 13984 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:24:28.681880 master-0 kubenswrapper[13984]: I0312 12:24:28.681851 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.681880 master-0 kubenswrapper[13984]: I0312 12:24:28.681909 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.681880 master-0 kubenswrapper[13984]: I0312 12:24:28.681923 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.682175 master-0 kubenswrapper[13984]: I0312 12:24:28.682039 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.682291 master-0 kubenswrapper[13984]: I0312 12:24:28.682260 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.682886 master-0 kubenswrapper[13984]: E0312 12:24:28.682856 13984 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0"
Mar 12 12:24:28.684214 master-0 kubenswrapper[13984]: I0312 12:24:28.684165 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.684214 master-0 kubenswrapper[13984]: I0312 12:24:28.684193 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.684214 master-0 kubenswrapper[13984]: I0312 12:24:28.684214 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.684214 master-0 kubenswrapper[13984]: I0312 12:24:28.684231 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.684621 master-0 kubenswrapper[13984]: I0312 12:24:28.684230 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.684621 master-0 kubenswrapper[13984]: I0312 12:24:28.684244 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.684621 master-0 kubenswrapper[13984]: I0312 12:24:28.684390 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.684621 master-0 kubenswrapper[13984]: I0312 12:24:28.684549 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.686262 master-0 kubenswrapper[13984]: I0312 12:24:28.686224 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.686398 master-0 kubenswrapper[13984]: I0312 12:24:28.686269 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.686398 master-0 kubenswrapper[13984]: I0312 12:24:28.686284 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.686809 master-0 kubenswrapper[13984]: I0312 12:24:28.686417 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.686809 master-0 kubenswrapper[13984]: I0312 12:24:28.686559 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.688981 master-0 kubenswrapper[13984]: I0312 12:24:28.688903 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.688981 master-0 kubenswrapper[13984]: I0312 12:24:28.688930 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.688981 master-0 kubenswrapper[13984]: I0312 12:24:28.688940 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.689768 master-0 kubenswrapper[13984]: I0312 12:24:28.689158 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.689768 master-0 kubenswrapper[13984]: I0312 12:24:28.689180 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.689768 master-0 kubenswrapper[13984]: I0312 12:24:28.689190 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.689768 master-0 kubenswrapper[13984]: I0312 12:24:28.689660 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.689768 master-0 kubenswrapper[13984]: I0312 12:24:28.689679 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.689768 master-0 kubenswrapper[13984]: I0312 12:24:28.689694 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.690278 master-0 kubenswrapper[13984]: I0312 12:24:28.689821 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.690278 master-0 kubenswrapper[13984]: I0312 12:24:28.689998 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.692736 master-0 kubenswrapper[13984]: I0312 12:24:28.692683 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.692736 master-0 kubenswrapper[13984]: I0312 12:24:28.692711 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.692736 master-0 kubenswrapper[13984]: I0312 12:24:28.692721 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.693145 master-0 kubenswrapper[13984]: I0312 12:24:28.692821 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.693145 master-0 kubenswrapper[13984]: I0312 12:24:28.693049 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.696876 master-0 kubenswrapper[13984]: I0312 12:24:28.696820 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.696876 master-0 kubenswrapper[13984]: I0312 12:24:28.696857 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.696876 master-0 kubenswrapper[13984]: I0312 12:24:28.696869 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.699368 master-0 kubenswrapper[13984]: I0312 12:24:28.699319 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.699535 master-0 kubenswrapper[13984]: I0312 12:24:28.699409 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.699535 master-0 kubenswrapper[13984]: I0312 12:24:28.699436 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.700124 master-0 kubenswrapper[13984]: I0312 12:24:28.699816 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.700124 master-0 kubenswrapper[13984]: I0312 12:24:28.700120 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.700124 master-0 kubenswrapper[13984]: I0312 12:24:28.700129 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700231 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7590fb693037429602853336fcf3ab3ffbbd224c01c7f6477abe20ca5e6814ec"
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700267 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf3d062904eac9ebbd852d158da7968f568f0b7b439f11aee315609af8d30a5c"
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700277 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd601e94b2e497fb8ecd7edce09ab1fd613fe37dc7cd5a8b1f710f61827b3468"
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700340 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"faf97dd3763176aab30d820af53bb9c984317f76cbc30d430d1b3c10226441f2"}
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700391 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"52a33ee59d801577da8aa50316a71355f9bbd8f241a6f2bf41da5e717c4566be"}
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700425 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63bee47de9c3b2c5c186d247e833d0879db6401ab7c44aeecd60140ee3df2506"
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700435 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"a8fedfcde448bdde81261d7ac4ac60d9af5f9d81f27ef839a88514a7008192a6"}
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700445 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"7405ab984b4b29bcfc8c9051e359b9cf1d8a4e58dd56cb0807b7d8e784ee3072"}
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700455 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"2e1c10ff585ac62d9bd97610178ea53b8be2cf0963171b0a4671fc1da4d09ac1"}
Mar 12 12:24:28.700446 master-0 kubenswrapper[13984]: I0312 12:24:28.700464 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"cfafaefe1186e2aaf0be8643dd0cbdc3cd2e42fe49f94ac73da95ad15ec95688"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700487 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700599 13984 scope.go:117] "RemoveContainer" containerID="60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af"
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700550 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"a3e42264f81889f53565ef287ba1a41d67422c625a1e231bc34f310555991c94"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700822 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"dbc67a0df6d005a6dc51cf6a3278cefd084341f566051f53aeef112699601de2"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700832 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"531a537f715644fc0623e243f05c5d1fb97e1cf2fd31e77e02a07259ef5d606f"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700841 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f6fe46052f0fc88bb12a846f99f359af9fcb5cab5db347f3b65949fb3346c632"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700850 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"700614ee74998a2f22b9ab087ce7bf59a848008e38ccd5dd9ed6bc2ca56e75b9"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700859 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"f0f3c0b6faeeb349351f1b371244c103cf179f81f4ae7b1577ee82387a636818"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700869 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df"}
Mar 12 12:24:28.701060 master-0 kubenswrapper[13984]: I0312 12:24:28.700715 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.700877 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"727b627c806117dd5f2141d70aae9b4f04fa57747cadae9611a9e80d6ca1b04b"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701630 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701646 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701675 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06325f241e9a80c6d8dd85ba7c0c06cd73e1284031ca3e0cffe506e69d0bffc7"
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701707 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a035c7049c39dfacabd51a96c5380843e394f28f3ad6d0e9bceda8fa427e90"
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701718 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d67f03c484ab80e82af34bfbee706a0210f43b50a2db09932840ccaa2c6ed0f8"
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701750 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="954043e71beb49d9918acfe524634995e0ee50c78b0c2c1e54744a1a30d16320"
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701759 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a732af73f8df5350fffca3678cce89972e48b40d0d2aaf3a11513ec460452d56"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701772 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e14a9901dd6e8c0b40633b500b5bd04a476d40c6df16f6b87f61140da2866a27"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701785 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"db3181f1b8f0872f0e0be3a238121d5924cf03894cfb4adac07406dd2be14404"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701797 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"83c7ea40f697796fd897b694662a9cd6658e7c3212140fcf936f6482cc114dbe"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701831 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"75e3bbf6064ff7f35e82389d300ce882963ffc8541436a1a80d239d3b971f5e4"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701844 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"49b7eed659093dd9559e33b3a0f638adb7dbba47c5e4ef561aa467ffc52b0eec"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701856 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"f7059097bb2ddbff3e211fcd38dee654f074ce10a4773944219fb7905e3d5723"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701866 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"b2f5073adcb260f7bafd0dbb4eae76a0d78ce200196488dbbf37a087b47f06a5"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701877 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"b0a55edaf166c34b98b5dcf71563b10a7151eb2e9e60290d0d641d4546feecf7"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701908 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"d7e05b92a4dbd9fdd6089f7478db7952000d8da47fb29fa9de9acabcf994c90c"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701965 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"17df0049e355b3a960768281cd9fb4fe90537eac08f31c82188b349d802deef8"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701976 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"c768108f96dad44b9e3bcf8d0d6db5eb9d2c1ac1b865d93bd4ff9f67e7bb635a"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.701987 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8"}
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.702000 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68fa0cf2fbdf6df9219e97b1643b059a7dd5beab8d56030f1c6403c3ac0499a2"
Mar 12 12:24:28.702842 master-0 kubenswrapper[13984]: I0312 12:24:28.702012 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3cb6ab6d4b334e440fd45f035b87035c3f267128290d2900f9474bca9932ecf6"
Mar 12 12:24:28.704135 master-0 kubenswrapper[13984]: I0312 12:24:28.704032 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:28.704135 master-0 kubenswrapper[13984]: I0312 12:24:28.704047 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:28.704135 master-0 kubenswrapper[13984]: I0312 12:24:28.704055 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:28.724165 master-0 kubenswrapper[13984]: I0312 12:24:28.724129 13984 scope.go:117] "RemoveContainer" containerID="60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af"
Mar 12 12:24:28.724573 master-0 kubenswrapper[13984]: E0312 12:24:28.724470 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af\": container with ID starting with 60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af not found: ID does not exist" containerID="60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af"
Mar 12 12:24:28.724696 master-0 kubenswrapper[13984]: I0312 12:24:28.724529 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af"} err="failed to get container status \"60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af\": rpc error: code = NotFound desc = could not find container \"60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af\": container with ID starting with 60f87b755755295fad8e7c56769fe7d1cd3602c3f4fe7e3b847eaae62ab922af not found: ID does not exist"
Mar 12 12:24:28.724696 master-0 kubenswrapper[13984]: I0312 12:24:28.724607 13984 scope.go:117] "RemoveContainer" containerID="8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14"
Mar 12 12:24:28.743882 master-0 kubenswrapper[13984]: I0312 12:24:28.743823 13984 scope.go:117] "RemoveContainer" containerID="8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14"
Mar 12 12:24:28.744291 master-0 kubenswrapper[13984]: E0312 12:24:28.744191 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14\": container with ID starting with 8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14 not found: ID does not exist" containerID="8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14"
Mar 12 12:24:28.744291 master-0 kubenswrapper[13984]: I0312 12:24:28.744278 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14"} err="failed to get container status \"8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14\": rpc error: code = NotFound desc = could not find container \"8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14\": container with ID starting with 8a752e27bc851e2391765443ed1b03d27b8204979fa134a7f1e83ddf2809ad14 not found: ID does not exist"
Mar 12 12:24:28.744513 master-0 kubenswrapper[13984]: I0312 12:24:28.744312 13984 scope.go:117] "RemoveContainer" containerID="a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb"
Mar 12 12:24:28.761659 master-0 kubenswrapper[13984]: I0312 12:24:28.761598 13984 scope.go:117] "RemoveContainer" containerID="a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb"
Mar 12 12:24:28.762093 master-0 kubenswrapper[13984]: E0312 12:24:28.761999 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb\": container with ID starting with a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb not found: ID does not exist" containerID="a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb"
Mar 12 12:24:28.762093 master-0 kubenswrapper[13984]: I0312 12:24:28.762039 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb"} err="failed to get container status \"a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb\": rpc error: code = NotFound desc = could not find container \"a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb\": container with ID starting with a28d6a51bd2d6acff72fcedbf328ab86864ae89d4aa0aa904b620f41a25b71eb not found: ID does not exist"
Mar 12 12:24:29.083838 master-0 kubenswrapper[13984]: I0312 12:24:29.083788 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:29.086488 master-0 kubenswrapper[13984]: I0312 12:24:29.086172 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:29.086488 master-0 kubenswrapper[13984]: I0312 12:24:29.086201 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:29.086488 master-0 kubenswrapper[13984]: I0312 12:24:29.086213 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:29.086488 master-0 kubenswrapper[13984]: I0312 12:24:29.086232 13984 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:24:29.090255 master-0 kubenswrapper[13984]: E0312 12:24:29.090213 13984 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0"
Mar 12 12:24:29.178354 master-0 kubenswrapper[13984]: I0312 12:24:29.178301 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/1.log"
Mar 12 12:24:29.182028 master-0 kubenswrapper[13984]: I0312 12:24:29.181994 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/1.log"
Mar 12 12:24:29.890766 master-0 kubenswrapper[13984]: I0312 12:24:29.890685 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:29.892762 master-0 kubenswrapper[13984]: I0312 12:24:29.892689 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:29.892762 master-0 kubenswrapper[13984]: I0312 12:24:29.892743 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:29.892762 master-0 kubenswrapper[13984]: I0312 12:24:29.892770 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:29.893039 master-0 kubenswrapper[13984]: I0312 12:24:29.892791 13984 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:24:29.898850 master-0 kubenswrapper[13984]: E0312 12:24:29.898788 13984 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0"
Mar 12 12:24:31.499615 master-0 kubenswrapper[13984]: I0312 12:24:31.499539 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:31.503970 master-0 kubenswrapper[13984]: I0312 12:24:31.503716 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:31.503970 master-0 kubenswrapper[13984]: I0312 12:24:31.503784 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:31.503970 master-0 
kubenswrapper[13984]: I0312 12:24:31.503804 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 12 12:24:31.503970 master-0 kubenswrapper[13984]: I0312 12:24:31.503837 13984 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 12 12:24:31.508075 master-0 kubenswrapper[13984]: E0312 12:24:31.508018 13984 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 12 12:24:32.202014 master-0 kubenswrapper[13984]: I0312 12:24:32.201960 13984 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="727b627c806117dd5f2141d70aae9b4f04fa57747cadae9611a9e80d6ca1b04b" exitCode=1 Mar 12 12:24:32.202237 master-0 kubenswrapper[13984]: I0312 12:24:32.202019 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"727b627c806117dd5f2141d70aae9b4f04fa57747cadae9611a9e80d6ca1b04b"} Mar 12 12:24:32.202237 master-0 kubenswrapper[13984]: I0312 12:24:32.202069 13984 scope.go:117] "RemoveContainer" containerID="f6dfc890ce63c2776178106266c3e0c423699c94ff93d96f132aa161452fe91d" Mar 12 12:24:32.937591 master-0 kubenswrapper[13984]: I0312 12:24:32.937468 13984 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 12:24:32.938640 master-0 kubenswrapper[13984]: I0312 12:24:32.938277 13984 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 12 12:24:32.942273 master-0 kubenswrapper[13984]: I0312 12:24:32.941573 13984 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 12:24:32.959973 master-0 kubenswrapper[13984]: I0312 12:24:32.959884 
13984 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 12 12:24:33.060201 master-0 kubenswrapper[13984]: I0312 12:24:33.060092 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.060201 master-0 kubenswrapper[13984]: I0312 12:24:33.060159 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.060201 master-0 kubenswrapper[13984]: I0312 12:24:33.060190 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060267 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060291 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod 
\"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060314 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060341 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060365 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060392 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060431 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" 
(UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060460 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060510 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060563 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060605 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.060628 master-0 kubenswrapper[13984]: I0312 12:24:33.060630 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.061374 master-0 kubenswrapper[13984]: I0312 12:24:33.060653 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.061374 master-0 kubenswrapper[13984]: I0312 12:24:33.060676 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:33.061374 master-0 kubenswrapper[13984]: I0312 12:24:33.060698 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:24:33.061374 master-0 kubenswrapper[13984]: I0312 12:24:33.060726 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:24:33.061374 master-0 
kubenswrapper[13984]: I0312 12:24:33.060746 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:24:33.161707 master-0 kubenswrapper[13984]: I0312 12:24:33.161633 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.161707 master-0 kubenswrapper[13984]: I0312 12:24:33.161626 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.162014 master-0 kubenswrapper[13984]: I0312 12:24:33.161788 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.162014 master-0 kubenswrapper[13984]: I0312 12:24:33.161826 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.162014 master-0 kubenswrapper[13984]: 
I0312 12:24:33.161941 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.162228 master-0 kubenswrapper[13984]: I0312 12:24:33.162087 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.162228 master-0 kubenswrapper[13984]: I0312 12:24:33.162068 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.162228 master-0 kubenswrapper[13984]: I0312 12:24:33.162156 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.162228 master-0 kubenswrapper[13984]: I0312 12:24:33.162191 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.162463 master-0 kubenswrapper[13984]: I0312 12:24:33.162241 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:24:33.162463 master-0 kubenswrapper[13984]: I0312 12:24:33.162329 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:24:33.162463 master-0 kubenswrapper[13984]: I0312 12:24:33.162432 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 12:24:33.162470 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 12:24:33.162579 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 
12:24:33.162594 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 12:24:33.162623 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 12:24:33.162535 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 12:24:33.162627 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 12:24:33.162681 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.162696 master-0 kubenswrapper[13984]: I0312 12:24:33.162699 13984 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162733 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162766 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162819 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162836 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162867 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162877 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162820 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162889 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162916 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162928 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162931 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162970 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.162990 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.163017 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.163061 13984 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.163063 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.163080 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.163117 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.163117 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:33.163184 master-0 kubenswrapper[13984]: I0312 12:24:33.163140 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:33.190097 master-0 kubenswrapper[13984]: I0312 12:24:33.189958 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.190097 master-0 kubenswrapper[13984]: I0312 12:24:33.190068 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.190292 master-0 kubenswrapper[13984]: I0312 12:24:33.190172 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.195568 master-0 kubenswrapper[13984]: I0312 12:24:33.195506 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.211126 master-0 kubenswrapper[13984]: I0312 12:24:33.211064 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.231978 master-0 kubenswrapper[13984]: E0312 12:24:33.231901 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 12 12:24:33.231978 master-0 kubenswrapper[13984]: E0312 12:24:33.231978 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:33.232342 master-0 kubenswrapper[13984]: E0312 12:24:33.232051 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-master-0\" already exists" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:33.232342 master-0 kubenswrapper[13984]: E0312 12:24:33.231918 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 12 12:24:33.232342 master-0 kubenswrapper[13984]: E0312 12:24:33.232174 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:33.232553 master-0 kubenswrapper[13984]: I0312 12:24:33.232131 13984 scope.go:117] "RemoveContainer" containerID="727b627c806117dd5f2141d70aae9b4f04fa57747cadae9611a9e80d6ca1b04b" Mar 12 12:24:33.232725 master-0 kubenswrapper[13984]: E0312 12:24:33.232671 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 12 12:24:33.924661 master-0 kubenswrapper[13984]: I0312 12:24:33.924512 13984 apiserver.go:52] "Watching apiserver" Mar 12 12:24:33.944406 master-0 kubenswrapper[13984]: I0312 12:24:33.944365 13984 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 12 12:24:33.946405 master-0 kubenswrapper[13984]: I0312 12:24:33.946357 13984 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4","openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4","openshift-insights/insights-operator-8f89dfddd-4vzl8","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-network-diagnostics/network-check-target-dfz7x","openshift-network-operator/network-operator-7c649bf6d4-rbb5m","openshift-ovn-kubernetes/ovnkube-node-l5d2w","openshift-service-ca/service-ca-84bfdbbb7f-gxx99","openshift-kube-controller-manager/installer-2-master-0","openshift-kube-scheduler/installer-4-master-0","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr","openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf","openshift-ingress-operator/ingress-operator-677db989d6-vpss8","openshift-multus/multus-admission-controller-8d675b596-xpzn2","openshift-cluster-node-tuning-operator/tuned-9zrvj","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd","openshift-multus/multus-hb48g","kube-system/bootstrap-kube-scheduler-master-0","openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7","openshift-etcd/etcd-master-0","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp","openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w","openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg","openshift-dns-operator/dns-operator-589895fbb7-l8x6p","openshift-dns/node-resolver-72w9q","openshift-multus/multus-additional-cni-plugins-r86hc","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft","openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p","openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk","openshift-network-node-identity/network-node-identity-rzmhl",
"openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v","openshift-controller-manager/controller-manager-6f87d47d96-c24tv","openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp","openshift-etcd/installer-1-master-0","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4","openshift-kube-controller-manager/kube-controller-manager-master-0","openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv","assisted-installer/assisted-installer-controller-p2nlp","openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m","openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d","openshift-dns/dns-default-k8t84","openshift-kube-apiserver/installer-1-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p","openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv","openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll","openshift-marketplace/marketplace-operator-64bf9778cb-rgstx","openshift-multus/network-metrics-daemon-4m9jh","openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw","openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx","openshift-network-operator/iptables-alerter-xqmw9","openshift-apiserver/apiserver-7849849f76-86f2r","op
enshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"] Mar 12 12:24:33.948756 master-0 kubenswrapper[13984]: I0312 12:24:33.948698 13984 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="62ebf93a-f9a1-4681-b086-18be545a70fd" Mar 12 12:24:33.953021 master-0 kubenswrapper[13984]: I0312 12:24:33.952983 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-p2nlp" Mar 12 12:24:33.953742 master-0 kubenswrapper[13984]: I0312 12:24:33.953713 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:33.953908 master-0 kubenswrapper[13984]: I0312 12:24:33.953839 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:33.953908 master-0 kubenswrapper[13984]: I0312 12:24:33.953877 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 12 12:24:33.954649 master-0 kubenswrapper[13984]: I0312 12:24:33.954616 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 12 12:24:33.957804 master-0 kubenswrapper[13984]: I0312 12:24:33.957769 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.959332 master-0 kubenswrapper[13984]: I0312 12:24:33.959257 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.959332 master-0 kubenswrapper[13984]: I0312 12:24:33.959327 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 12 12:24:33.962706 master-0 kubenswrapper[13984]: I0312 12:24:33.962656 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 12 12:24:33.963359 master-0 kubenswrapper[13984]: I0312 12:24:33.963322 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 12 12:24:33.963630 master-0 kubenswrapper[13984]: I0312 12:24:33.963590 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 12:24:33.964031 master-0 kubenswrapper[13984]: I0312 12:24:33.964005 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.966758 master-0 kubenswrapper[13984]: I0312 12:24:33.966711 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 12 12:24:33.966989 master-0 kubenswrapper[13984]: I0312 12:24:33.966958 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 12 12:24:33.967042 master-0 
kubenswrapper[13984]: I0312 12:24:33.967008 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 12 12:24:33.967167 master-0 kubenswrapper[13984]: I0312 12:24:33.967131 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 12 12:24:33.967214 master-0 kubenswrapper[13984]: I0312 12:24:33.967203 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 12 12:24:33.967252 master-0 kubenswrapper[13984]: I0312 12:24:33.967223 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 12:24:33.967330 master-0 kubenswrapper[13984]: I0312 12:24:33.967296 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 12 12:24:33.967412 master-0 kubenswrapper[13984]: I0312 12:24:33.967230 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 12 12:24:33.967452 master-0 kubenswrapper[13984]: I0312 12:24:33.967406 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-srrbs" Mar 12 12:24:33.967613 master-0 kubenswrapper[13984]: I0312 12:24:33.967406 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 12 12:24:33.967653 master-0 kubenswrapper[13984]: I0312 12:24:33.967612 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 12:24:33.967738 master-0 kubenswrapper[13984]: I0312 12:24:33.967716 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 12 12:24:33.967775 master-0 kubenswrapper[13984]: I0312 
12:24:33.967326 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 12 12:24:33.967842 master-0 kubenswrapper[13984]: I0312 12:24:33.967820 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 12 12:24:33.968413 master-0 kubenswrapper[13984]: I0312 12:24:33.968369 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 12 12:24:33.975706 master-0 kubenswrapper[13984]: I0312 12:24:33.975436 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 12 12:24:33.978636 master-0 kubenswrapper[13984]: I0312 12:24:33.978576 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 12:24:33.979044 master-0 kubenswrapper[13984]: I0312 12:24:33.978662 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 12:24:33.979418 master-0 kubenswrapper[13984]: I0312 12:24:33.979380 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 12 12:24:33.980353 master-0 kubenswrapper[13984]: I0312 12:24:33.979617 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 12 12:24:33.980422 master-0 kubenswrapper[13984]: I0312 12:24:33.980370 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 12:24:33.980559 master-0 kubenswrapper[13984]: I0312 12:24:33.980540 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 12 12:24:33.980704 master-0 kubenswrapper[13984]: 
I0312 12:24:33.980679 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.980803 master-0 kubenswrapper[13984]: I0312 12:24:33.980760 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 12:24:33.980959 master-0 kubenswrapper[13984]: I0312 12:24:33.980550 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.981555 master-0 kubenswrapper[13984]: I0312 12:24:33.981445 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 12:24:33.981618 master-0 kubenswrapper[13984]: I0312 12:24:33.981584 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 12 12:24:33.981618 master-0 kubenswrapper[13984]: I0312 12:24:33.980987 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 12:24:33.981707 master-0 kubenswrapper[13984]: I0312 12:24:33.981028 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 12:24:33.981763 master-0 kubenswrapper[13984]: I0312 12:24:33.981073 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 12 12:24:33.981827 master-0 kubenswrapper[13984]: I0312 12:24:33.981806 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 12 12:24:33.981827 master-0 kubenswrapper[13984]: I0312 12:24:33.981818 13984 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 12:24:33.981927 master-0 kubenswrapper[13984]: I0312 12:24:33.981896 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 12:24:33.981927 master-0 kubenswrapper[13984]: I0312 12:24:33.981924 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.982027 master-0 kubenswrapper[13984]: I0312 12:24:33.981117 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 12:24:33.982220 master-0 kubenswrapper[13984]: I0312 12:24:33.981184 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 12 12:24:33.982272 master-0 kubenswrapper[13984]: I0312 12:24:33.982225 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 12 12:24:33.982324 master-0 kubenswrapper[13984]: I0312 12:24:33.981186 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 12 12:24:33.982441 master-0 kubenswrapper[13984]: I0312 12:24:33.981220 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.982621 master-0 kubenswrapper[13984]: I0312 12:24:33.982609 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 12:24:33.982678 master-0 kubenswrapper[13984]: I0312 12:24:33.981233 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-spkrx" Mar 12 12:24:33.982732 master-0 kubenswrapper[13984]: I0312 12:24:33.981263 13984 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 12 12:24:33.982776 master-0 kubenswrapper[13984]: I0312 12:24:33.981268 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 12 12:24:33.982822 master-0 kubenswrapper[13984]: I0312 12:24:33.981294 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 12 12:24:33.982865 master-0 kubenswrapper[13984]: I0312 12:24:33.981324 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 12 12:24:33.982914 master-0 kubenswrapper[13984]: I0312 12:24:33.981345 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 12 12:24:33.982960 master-0 kubenswrapper[13984]: I0312 12:24:33.981350 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-drmbk" Mar 12 12:24:33.983003 master-0 kubenswrapper[13984]: I0312 12:24:33.981399 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 12 12:24:33.983377 master-0 kubenswrapper[13984]: I0312 12:24:33.983218 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 12 12:24:33.983472 master-0 kubenswrapper[13984]: I0312 12:24:33.983377 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.983472 master-0 kubenswrapper[13984]: I0312 12:24:33.983337 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 12:24:33.983577 master-0 kubenswrapper[13984]: I0312 12:24:33.983493 13984 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 12 12:24:33.983823 master-0 kubenswrapper[13984]: I0312 12:24:33.983790 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 12 12:24:33.983890 master-0 kubenswrapper[13984]: I0312 12:24:33.983801 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 12 12:24:33.983925 master-0 kubenswrapper[13984]: I0312 12:24:33.983880 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 12:24:33.983975 master-0 kubenswrapper[13984]: I0312 12:24:33.983809 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 12:24:33.984114 master-0 kubenswrapper[13984]: I0312 12:24:33.984064 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 12 12:24:33.984279 master-0 kubenswrapper[13984]: I0312 12:24:33.984259 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 12 12:24:33.984329 master-0 kubenswrapper[13984]: I0312 12:24:33.984262 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 12 12:24:33.984329 master-0 kubenswrapper[13984]: I0312 12:24:33.984305 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 12:24:33.984386 master-0 kubenswrapper[13984]: I0312 12:24:33.984369 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 12 12:24:33.984415 master-0 kubenswrapper[13984]: I0312 12:24:33.984401 13984 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 12 12:24:33.984440 master-0 kubenswrapper[13984]: I0312 12:24:33.984431 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 12 12:24:33.984501 master-0 kubenswrapper[13984]: I0312 12:24:33.984488 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 12 12:24:33.984542 master-0 kubenswrapper[13984]: I0312 12:24:33.984505 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 12 12:24:33.984542 master-0 kubenswrapper[13984]: I0312 12:24:33.984510 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 12:24:33.984617 master-0 kubenswrapper[13984]: I0312 12:24:33.984561 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 12:24:33.984617 master-0 kubenswrapper[13984]: I0312 12:24:33.984585 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 12:24:33.984617 master-0 kubenswrapper[13984]: I0312 12:24:33.984608 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 12 12:24:33.984701 master-0 kubenswrapper[13984]: I0312 12:24:33.984642 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 12:24:33.984701 master-0 kubenswrapper[13984]: I0312 12:24:33.984664 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 12 12:24:33.984701 master-0 kubenswrapper[13984]: I0312 12:24:33.984691 13984 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"image-import-ca" Mar 12 12:24:33.984785 master-0 kubenswrapper[13984]: I0312 12:24:33.984713 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.984785 master-0 kubenswrapper[13984]: I0312 12:24:33.984522 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 12:24:33.984785 master-0 kubenswrapper[13984]: I0312 12:24:33.984758 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 12 12:24:33.984902 master-0 kubenswrapper[13984]: I0312 12:24:33.984790 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 12:24:33.984902 master-0 kubenswrapper[13984]: I0312 12:24:33.984846 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 12:24:33.984902 master-0 kubenswrapper[13984]: I0312 12:24:33.984880 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 12 12:24:33.984988 master-0 kubenswrapper[13984]: I0312 12:24:33.984925 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 12 12:24:33.984988 master-0 kubenswrapper[13984]: I0312 12:24:33.984926 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 12 12:24:33.984988 master-0 kubenswrapper[13984]: I0312 12:24:33.984960 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-hjhdl" Mar 12 12:24:33.985060 master-0 kubenswrapper[13984]: I0312 12:24:33.985005 13984 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 12 12:24:33.985060 master-0 kubenswrapper[13984]: I0312 12:24:33.984849 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 12 12:24:33.985060 master-0 kubenswrapper[13984]: I0312 12:24:33.985040 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 12 12:24:33.985140 master-0 kubenswrapper[13984]: I0312 12:24:33.985111 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-7skrh" Mar 12 12:24:33.985171 master-0 kubenswrapper[13984]: I0312 12:24:33.985140 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 12 12:24:33.985955 master-0 kubenswrapper[13984]: I0312 12:24:33.985193 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 12 12:24:33.985955 master-0 kubenswrapper[13984]: I0312 12:24:33.985362 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 12:24:33.985955 master-0 kubenswrapper[13984]: I0312 12:24:33.985489 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 12:24:33.987943 master-0 kubenswrapper[13984]: I0312 12:24:33.987906 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 12:24:33.988599 master-0 kubenswrapper[13984]: I0312 12:24:33.988575 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-qwrgw" Mar 12 12:24:33.988756 master-0 kubenswrapper[13984]: I0312 12:24:33.988570 
13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 12 12:24:33.989585 master-0 kubenswrapper[13984]: I0312 12:24:33.989556 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 12 12:24:33.990878 master-0 kubenswrapper[13984]: I0312 12:24:33.990849 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 12 12:24:33.997893 master-0 kubenswrapper[13984]: I0312 12:24:33.997848 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 12:24:34.016228 master-0 kubenswrapper[13984]: I0312 12:24:34.016181 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-pmfnh" Mar 12 12:24:34.036528 master-0 kubenswrapper[13984]: I0312 12:24:34.036449 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 12:24:34.041916 master-0 kubenswrapper[13984]: I0312 12:24:34.041890 13984 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 12 12:24:34.055701 master-0 kubenswrapper[13984]: I0312 12:24:34.055657 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 12 12:24:34.070332 master-0 kubenswrapper[13984]: I0312 12:24:34.070268 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 
12:24:34.070528 master-0 kubenswrapper[13984]: I0312 12:24:34.070407 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:24:34.070586 master-0 kubenswrapper[13984]: I0312 12:24:34.070572 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-serving-cert\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.070650 master-0 kubenswrapper[13984]: I0312 12:24:34.070606 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.070698 master-0 kubenswrapper[13984]: I0312 12:24:34.070655 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:24:34.070759 master-0 kubenswrapper[13984]: I0312 12:24:34.070681 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: 
\"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.070852 master-0 kubenswrapper[13984]: I0312 12:24:34.070811 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dhfq\" (UniqueName: \"kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.070930 master-0 kubenswrapper[13984]: I0312 12:24:34.070893 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5xlx\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:24:34.070971 master-0 kubenswrapper[13984]: I0312 12:24:34.070949 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:24:34.071002 master-0 kubenswrapper[13984]: I0312 12:24:34.070942 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.071036 master-0 kubenswrapper[13984]: I0312 12:24:34.071015 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36852fda-6aee-4a36-8724-537f1260c4c8-hosts-file\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q" Mar 12 12:24:34.071067 master-0 kubenswrapper[13984]: I0312 12:24:34.071033 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-serving-cert\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.071184 master-0 kubenswrapper[13984]: I0312 12:24:34.071141 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:34.071218 master-0 kubenswrapper[13984]: I0312 12:24:34.071201 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-etcd-client\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.071259 master-0 kubenswrapper[13984]: I0312 12:24:34.071235 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/720101f1-0833-45af-a5b7-4910ece2a589-audit-dir\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.071289 
master-0 kubenswrapper[13984]: I0312 12:24:34.071277 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.071332 master-0 kubenswrapper[13984]: I0312 12:24:34.071311 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx2c4\" (UniqueName: \"kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:24:34.071367 master-0 kubenswrapper[13984]: I0312 12:24:34.071353 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:24:34.071400 master-0 kubenswrapper[13984]: I0312 12:24:34.071385 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86k82\" (UniqueName: \"kubernetes.io/projected/632651f7-6641-49d8-9c48-7f6ea5846538-kube-api-access-86k82\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:34.071456 master-0 kubenswrapper[13984]: I0312 12:24:34.071440 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.071507 master-0 kubenswrapper[13984]: I0312 12:24:34.071471 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-run\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.071572 master-0 kubenswrapper[13984]: I0312 12:24:34.071525 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:24:34.071572 master-0 kubenswrapper[13984]: I0312 12:24:34.071556 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:34.071710 master-0 kubenswrapper[13984]: I0312 12:24:34.071586 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/022dd526-0ea5-4224-9d2e-778ed4ef8a56-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.071710 master-0 kubenswrapper[13984]: I0312 12:24:34.071622 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.071710 master-0 kubenswrapper[13984]: I0312 12:24:34.071653 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvrpf\" (UniqueName: \"kubernetes.io/projected/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef-kube-api-access-wvrpf\") pod \"csi-snapshot-controller-7577d6f48-kf7kw\" (UID: \"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" Mar 12 12:24:34.071710 master-0 kubenswrapper[13984]: I0312 12:24:34.071687 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.071906 master-0 kubenswrapper[13984]: I0312 12:24:34.071717 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.071906 master-0 kubenswrapper[13984]: I0312 12:24:34.071750 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") pod 
\"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:34.071906 master-0 kubenswrapper[13984]: I0312 12:24:34.071785 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqcrz\" (UniqueName: \"kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:24:34.071906 master-0 kubenswrapper[13984]: I0312 12:24:34.071816 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.071906 master-0 kubenswrapper[13984]: I0312 12:24:34.071850 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:34.071906 master-0 kubenswrapper[13984]: I0312 12:24:34.071881 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ns72p\" (UniqueName: \"kubernetes.io/projected/99a11fe6-48a1-439e-b788-158dbe267dcd-kube-api-access-ns72p\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:34.072098 master-0 
kubenswrapper[13984]: I0312 12:24:34.071912 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxrnf\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-kube-api-access-pxrnf\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.072098 master-0 kubenswrapper[13984]: I0312 12:24:34.071948 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:24:34.072098 master-0 kubenswrapper[13984]: I0312 12:24:34.071980 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdvf6\" (UniqueName: \"kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:24:34.072098 master-0 kubenswrapper[13984]: I0312 12:24:34.072009 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:24:34.072098 master-0 kubenswrapper[13984]: I0312 12:24:34.072040 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4" Mar 12 12:24:34.072098 master-0 kubenswrapper[13984]: I0312 12:24:34.072068 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:34.072098 master-0 kubenswrapper[13984]: I0312 12:24:34.072097 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94src\" (UniqueName: \"kubernetes.io/projected/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-kube-api-access-94src\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:34.072307 master-0 kubenswrapper[13984]: I0312 12:24:34.072129 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:34.072307 master-0 kubenswrapper[13984]: I0312 12:24:34.072160 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-kubernetes\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.072307 master-0 kubenswrapper[13984]: I0312 12:24:34.072190 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-encryption-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.072307 master-0 kubenswrapper[13984]: I0312 12:24:34.072214 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:24:34.072307 master-0 kubenswrapper[13984]: I0312 12:24:34.072241 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.072307 master-0 kubenswrapper[13984]: I0312 12:24:34.072288 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.072516 master-0 kubenswrapper[13984]: I0312 12:24:34.072320 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-etcd-serving-ca\") pod 
\"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.072516 master-0 kubenswrapper[13984]: I0312 12:24:34.072349 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-modprobe-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.072516 master-0 kubenswrapper[13984]: I0312 12:24:34.072394 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq5c7\" (UniqueName: \"kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:24:34.072516 master-0 kubenswrapper[13984]: I0312 12:24:34.072423 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.072516 master-0 kubenswrapper[13984]: I0312 12:24:34.072449 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.072516 master-0 kubenswrapper[13984]: I0312 12:24:34.072499 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhwbb\" (UniqueName: 
\"kubernetes.io/projected/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-kube-api-access-qhwbb\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072528 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072572 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bx48\" (UniqueName: \"kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48\") pod \"csi-snapshot-controller-operator-5685fbc7d-vmj4h\" (UID: \"5a012d0b-d1a8-4cd3-8b91-b346d0445f24\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072597 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrqx7\" (UniqueName: \"kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072603 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-etcd-client\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072624 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072656 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072689 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-image-import-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072710 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072751 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:24:34.072784 master-0 kubenswrapper[13984]: I0312 12:24:34.072774 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:24:34.073027 master-0 kubenswrapper[13984]: I0312 12:24:34.072799 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:34.073027 master-0 kubenswrapper[13984]: I0312 12:24:34.072825 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.073027 master-0 kubenswrapper[13984]: I0312 12:24:34.072868 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btbt7\" (UniqueName: \"kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:34.073408 master-0 kubenswrapper[13984]: I0312 12:24:34.073374 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-config\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:24:34.073505 master-0 kubenswrapper[13984]: I0312 12:24:34.073469 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:34.074103 master-0 kubenswrapper[13984]: I0312 12:24:34.073668 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-encryption-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.074103 master-0 kubenswrapper[13984]: I0312 12:24:34.073755 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/022dd526-0ea5-4224-9d2e-778ed4ef8a56-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.074103 master-0 kubenswrapper[13984]: I0312 12:24:34.073966 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 
12:24:34.074103 master-0 kubenswrapper[13984]: I0312 12:24:34.073995 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.074103 master-0 kubenswrapper[13984]: I0312 12:24:34.074020 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.074103 master-0 kubenswrapper[13984]: I0312 12:24:34.074043 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.074350 master-0 kubenswrapper[13984]: I0312 12:24:34.074102 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-image-import-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.074350 master-0 kubenswrapper[13984]: I0312 12:24:34.074149 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-cni-binary-copy\") pod \"multus-hb48g\" (UID: 
\"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.074350 master-0 kubenswrapper[13984]: I0312 12:24:34.074175 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-env-overrides\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:24:34.074350 master-0 kubenswrapper[13984]: I0312 12:24:34.073998 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-etcd-serving-ca\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.074350 master-0 kubenswrapper[13984]: I0312 12:24:34.074212 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.074350 master-0 kubenswrapper[13984]: I0312 12:24:34.074337 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074368 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets\") pod 
\"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074400 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074427 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074438 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074533 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54547\" (UniqueName: \"kubernetes.io/projected/36852fda-6aee-4a36-8724-537f1260c4c8-kube-api-access-54547\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074561 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074582 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074605 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8mvz\" (UniqueName: \"kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074627 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074651 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lf74f\" (UniqueName: 
\"kubernetes.io/projected/580bafd6-af8c-4961-b959-b736a180e309-kube-api-access-lf74f\") pod \"migrator-57ccdf9b5-4rjrp\" (UID: \"580bafd6-af8c-4961-b959-b736a180e309\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074745 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074771 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074795 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:34.074825 master-0 kubenswrapper[13984]: I0312 12:24:34.074822 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.075330 master-0 
kubenswrapper[13984]: I0312 12:24:34.074860 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdwfl\" (UniqueName: \"kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.074885 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/632651f7-6641-49d8-9c48-7f6ea5846538-cert\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.074906 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-serving-cert\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.074939 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wj8x\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-kube-api-access-6wj8x\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.074971 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod 
\"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075107 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5890f0c-cebe-4788-89f7-27568d875741-config\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075124 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysconfig\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075159 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075188 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wg27g\" (UniqueName: \"kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075225 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j5jgq\" (UniqueName: \"kubernetes.io/projected/81cb0504-9455-4398-aed1-5cc6790f292e-kube-api-access-j5jgq\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075259 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c62edaec-38e2-4b73-8bb5-c776abfb310f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075290 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075325 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6l57v\" (UniqueName: \"kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:24:34.075330 master-0 kubenswrapper[13984]: I0312 12:24:34.075332 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-metrics-tls\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p" Mar 12 12:24:34.075904 master-0 kubenswrapper[13984]: I0312 12:24:34.075356 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.075904 master-0 kubenswrapper[13984]: I0312 12:24:34.075390 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-trusted-ca-bundle\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.075904 master-0 kubenswrapper[13984]: I0312 12:24:34.075437 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:24:34.075904 master-0 kubenswrapper[13984]: I0312 12:24:34.075500 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cv9kn\" (UniqueName: \"kubernetes.io/projected/2bab9dba-235f-467c-9224-634cca9acbd2-kube-api-access-cv9kn\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:34.075904 master-0 kubenswrapper[13984]: I0312 12:24:34.075537 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:24:34.076187 master-0 kubenswrapper[13984]: I0312 12:24:34.075954 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68c57a64-f30c-4caf-89ef-08bd0d36833e-service-ca-bundle\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:34.076187 master-0 kubenswrapper[13984]: I0312 12:24:34.075954 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c62edaec-38e2-4b73-8bb5-c776abfb310f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" Mar 12 12:24:34.076187 master-0 kubenswrapper[13984]: I0312 12:24:34.075391 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-operand-assets\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:24:34.076187 master-0 kubenswrapper[13984]: I0312 12:24:34.076086 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.076187 master-0 kubenswrapper[13984]: I0312 12:24:34.076124 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lnbq\" (UniqueName: \"kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:24:34.076187 master-0 kubenswrapper[13984]: I0312 12:24:34.076159 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.076541 master-0 kubenswrapper[13984]: I0312 12:24:34.076195 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4r7v\" (UniqueName: \"kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:34.076541 master-0 kubenswrapper[13984]: I0312 12:24:34.076228 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-encryption-config\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.076541 master-0 
kubenswrapper[13984]: I0312 12:24:34.076261 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:24:34.076541 master-0 kubenswrapper[13984]: I0312 12:24:34.076347 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-serving-cert\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.076707 master-0 kubenswrapper[13984]: I0312 12:24:34.076556 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jq9d5\" (UniqueName: \"kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:24:34.076707 master-0 kubenswrapper[13984]: I0312 12:24:34.076592 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:34.076707 master-0 kubenswrapper[13984]: I0312 12:24:34.076616 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-k9zjm" Mar 12 12:24:34.076707 master-0 kubenswrapper[13984]: I0312 12:24:34.076638 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e64bc838-280e-4231-9732-1adb69fed0bc-metrics-certs\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh" Mar 12 12:24:34.076866 master-0 kubenswrapper[13984]: I0312 12:24:34.076619 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.076866 master-0 kubenswrapper[13984]: I0312 12:24:34.076796 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.076866 master-0 kubenswrapper[13984]: I0312 12:24:34.076828 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7tct\" (UniqueName: \"kubernetes.io/projected/15bf86d9-62b3-4af8-b6f6-23131d712332-kube-api-access-g7tct\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" Mar 12 12:24:34.077003 master-0 kubenswrapper[13984]: I0312 12:24:34.076876 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 
12:24:34.077067 master-0 kubenswrapper[13984]: I0312 12:24:34.077045 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-trusted-ca-bundle\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.077132 master-0 kubenswrapper[13984]: I0312 12:24:34.077087 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:24:34.077132 master-0 kubenswrapper[13984]: I0312 12:24:34.077114 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-lib-modules\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.077244 master-0 kubenswrapper[13984]: I0312 12:24:34.077136 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.077244 master-0 kubenswrapper[13984]: I0312 12:24:34.077162 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.077244 master-0 kubenswrapper[13984]: I0312 12:24:34.077186 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.077244 master-0 kubenswrapper[13984]: I0312 12:24:34.077209 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-sys\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.077244 master-0 kubenswrapper[13984]: I0312 12:24:34.077233 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:34.077704 master-0 kubenswrapper[13984]: I0312 12:24:34.077259 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:34.077704 master-0 kubenswrapper[13984]: I0312 12:24:34.077473 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/720101f1-0833-45af-a5b7-4910ece2a589-encryption-config\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.077857 master-0 kubenswrapper[13984]: I0312 12:24:34.077722 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:24:34.077857 master-0 kubenswrapper[13984]: I0312 12:24:34.077782 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/81cb0504-9455-4398-aed1-5cc6790f292e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:34.077857 master-0 kubenswrapper[13984]: I0312 12:24:34.077786 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:24:34.077857 master-0 kubenswrapper[13984]: I0312 12:24:34.077832 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-serving-cert\") 
pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:24:34.078025 master-0 kubenswrapper[13984]: I0312 12:24:34.077882 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbztv\" (UniqueName: \"kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:24:34.078025 master-0 kubenswrapper[13984]: I0312 12:24:34.077913 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:24:34.078025 master-0 kubenswrapper[13984]: I0312 12:24:34.077971 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:34.078025 master-0 kubenswrapper[13984]: I0312 12:24:34.078001 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.078186 master-0 kubenswrapper[13984]: I0312 
12:24:34.078166 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:34.078300 master-0 kubenswrapper[13984]: I0312 12:24:34.078248 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-client\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:24:34.078361 master-0 kubenswrapper[13984]: I0312 12:24:34.078330 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.078420 master-0 kubenswrapper[13984]: I0312 12:24:34.078371 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-config\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:34.078463 master-0 kubenswrapper[13984]: I0312 12:24:34.078439 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-auth-proxy-config\") pod 
\"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078495 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xggp6\" (UniqueName: \"kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078572 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078640 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078674 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-tuned\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078742 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d47f860-d64a-49b8-b404-a67cbc2faeb6-metrics-tls\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078774 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit-dir\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078824 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078856 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-images\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078887 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njx9l\" (UniqueName: \"kubernetes.io/projected/9d47f860-d64a-49b8-b404-a67cbc2faeb6-kube-api-access-njx9l\") 
pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078916 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.078919 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.079038 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.079119 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-tuned\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.079122 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/81cb0504-9455-4398-aed1-5cc6790f292e-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.079189 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a11fe6-48a1-439e-b788-158dbe267dcd-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.079225 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwfct\" (UniqueName: \"kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:24:34.079330 master-0 kubenswrapper[13984]: I0312 12:24:34.079258 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079648 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/99a11fe6-48a1-439e-b788-158dbe267dcd-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079700 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-config\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079733 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079766 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079852 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-var-lib-kubelet\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079897 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079935 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.079972 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2f6r\" (UniqueName: \"kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080009 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kv9fk\" (UniqueName: \"kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080042 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080157 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080195 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080229 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9gmt\" (UniqueName: \"kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080264 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080299 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080337 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68c57a64-f30c-4caf-89ef-08bd0d36833e-serving-cert\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080377 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080419 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080454 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080508 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080544 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080710 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-config\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080743 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080785 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p97xk\" (UniqueName: \"kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080819 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080851 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:34.081032 master-0 kubenswrapper[13984]: I0312 12:24:34.080885 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"
Mar 12 12:24:34.082177 master-0 kubenswrapper[13984]: I0312 12:24:34.081249 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/68c57a64-f30c-4caf-89ef-08bd0d36833e-serving-cert\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8"
Mar 12 12:24:34.082177 master-0 kubenswrapper[13984]: I0312 12:24:34.081422 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/51d58450-50bb-4da0-b1f6-4135fbabd856-webhook-cert\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl"
Mar 12 12:24:34.082177 master-0 kubenswrapper[13984]: I0312 12:24:34.081429 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/81cb0504-9455-4398-aed1-5cc6790f292e-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:34.082177 master-0 kubenswrapper[13984]: I0312 12:24:34.081723 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:24:34.082177 master-0 kubenswrapper[13984]: I0312 12:24:34.081129 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:24:34.082177 master-0 kubenswrapper[13984]: I0312 12:24:34.081840 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:34.082177 master-0 kubenswrapper[13984]: I0312 12:24:34.082021 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:24:34.082450 master-0 kubenswrapper[13984]: I0312 12:24:34.082396 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5890f0c-cebe-4788-89f7-27568d875741-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:24:34.085643 master-0 kubenswrapper[13984]: I0312 12:24:34.085474 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:34.085877 master-0 kubenswrapper[13984]: I0312 12:24:34.085839 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:34.085956 master-0 kubenswrapper[13984]: I0312 12:24:34.085935 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:24:34.086002 master-0 kubenswrapper[13984]: I0312 12:24:34.085969 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:24:34.086047 master-0 kubenswrapper[13984]: I0312 12:24:34.086017 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:24:34.086099 master-0 kubenswrapper[13984]: I0312 12:24:34.086046 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4nbb\" (UniqueName: \"kubernetes.io/projected/68c57a64-f30c-4caf-89ef-08bd0d36833e-kube-api-access-x4nbb\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8"
Mar 12 12:24:34.086099 master-0 kubenswrapper[13984]: I0312 12:24:34.086084 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-tmp\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj"
Mar 12 12:24:34.086316 master-0 kubenswrapper[13984]: I0312 12:24:34.086283 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:24:34.086378 master-0 kubenswrapper[13984]: I0312 12:24:34.086358 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-tmp\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj"
Mar 12 12:24:34.086427 master-0 kubenswrapper[13984]: I0312 12:24:34.086410 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcb5s\" (UniqueName: \"kubernetes.io/projected/720101f1-0833-45af-a5b7-4910ece2a589-kube-api-access-dcb5s\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:24:34.086579 master-0 kubenswrapper[13984]: I0312 12:24:34.086555 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-images\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:34.086629 master-0 kubenswrapper[13984]: I0312 12:24:34.086572 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8a121d0d-d201-446b-97a1-e2414e599f4a-config\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:24:34.086629 master-0 kubenswrapper[13984]: I0312 12:24:34.086589 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:24:34.086701 master-0 kubenswrapper[13984]: I0312 12:24:34.086643 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-conf\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj"
Mar 12 12:24:34.086701 master-0 kubenswrapper[13984]: I0312 12:24:34.086687 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s28gq\" (UniqueName: \"kubernetes.io/projected/f19c3c89-8d32-4394-bd86-e5ef7734c42b-kube-api-access-s28gq\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"
Mar 12 12:24:34.086789 master-0 kubenswrapper[13984]: I0312 12:24:34.086739 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m"
Mar 12 12:24:34.086789 master-0 kubenswrapper[13984]: I0312 12:24:34.086785 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:24:34.086868 master-0 kubenswrapper[13984]: I0312 12:24:34.086807 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:24:34.086868 master-0 kubenswrapper[13984]: I0312 12:24:34.086851 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:24:34.086950 master-0 kubenswrapper[13984]: I0312 12:24:34.086879 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmlzw\" (UniqueName: \"kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:24:34.086950 master-0 kubenswrapper[13984]: I0312 12:24:34.086906 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b704e7-1291-4645-8a0d-2a937829d7ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"
Mar 12 12:24:34.086950 master-0 kubenswrapper[13984]: I0312 12:24:34.086939 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl"
Mar 12 12:24:34.087056 master-0 kubenswrapper[13984]: I0312 12:24:34.086964 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:24:34.087104 master-0 kubenswrapper[13984]: I0312 12:24:34.087068 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:24:34.087104 master-0 kubenswrapper[13984]: I0312 12:24:34.087097 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-client\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:34.087182 master-0 kubenswrapper[13984]: I0312 12:24:34.087121 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:24:34.087182 master-0 kubenswrapper[13984]: I0312 12:24:34.087146 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zct7x\" (UniqueName: \"kubernetes.io/projected/f3b704e7-1291-4645-8a0d-2a937829d7ac-kube-api-access-zct7x\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"
Mar 12 12:24:34.087182 master-0 kubenswrapper[13984]: I0312 12:24:34.087173 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:24:34.087291 master-0 kubenswrapper[13984]: I0312 12:24:34.087198 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:24:34.087291 master-0 kubenswrapper[13984]: I0312 12:24:34.087224 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:24:34.087539 master-0 kubenswrapper[13984]: I0312 12:24:34.087521 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:24:34.087591 master-0 kubenswrapper[13984]: I0312 12:24:34.087555 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bab9dba-235f-467c-9224-634cca9acbd2-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"
Mar 12 12:24:34.087700 master-0 kubenswrapper[13984]: I0312 12:24:34.087682 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/81cb0504-9455-4398-aed1-5cc6790f292e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:34.087749 master-0 kubenswrapper[13984]: I0312 12:24:34.087713 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-host\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj"
Mar 12 12:24:34.087749 master-0 kubenswrapper[13984]: I0312 12:24:34.087735 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:24:34.087822 master-0 kubenswrapper[13984]: I0312 12:24:34.087763 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gd645\" (UniqueName: \"kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:24:34.087822 master-0 kubenswrapper[13984]: I0312 12:24:34.087787 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6flvz\" (UniqueName: \"kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:24:34.087822 master-0 kubenswrapper[13984]: I0312 12:24:34.087816 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:24:34.087930 master-0 kubenswrapper[13984]: I0312 12:24:34.087840 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:24:34.087930 master-0 kubenswrapper[13984]: I0312 12:24:34.087863 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"
Mar 12 12:24:34.087930 master-0 kubenswrapper[13984]: I0312 12:24:34.087886 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/51d58450-50bb-4da0-b1f6-4135fbabd856-ovnkube-identity-cm\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl"
Mar 12 12:24:34.087930 master-0 kubenswrapper[13984]: I0312 12:24:34.087887 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:24:34.088081 master-0 kubenswrapper[13984]: I0312 12:24:34.087932 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-key\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:24:34.088081 master-0 kubenswrapper[13984]: I0312 12:24:34.087964 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:24:34.088081 master-0 kubenswrapper[13984]: I0312 12:24:34.087993 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdfnh\" (UniqueName: \"kubernetes.io/projected/c62edaec-38e2-4b73-8bb5-c776abfb310f-kube-api-access-cdfnh\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"
Mar 12 12:24:34.088081 master-0 kubenswrapper[13984]: I0312 12:24:34.088020 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-trusted-ca-bundle\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:34.088081 master-0 kubenswrapper[13984]: I0312 12:24:34.088044 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:24:34.088081 master-0 kubenswrapper[13984]: I0312 12:24:34.088069 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-images\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"
Mar 12 12:24:34.088314 master-0 kubenswrapper[13984]: I0312 12:24:34.088135 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-client\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:34.088375 master-0 kubenswrapper[13984]: I0312 12:24:34.088355 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a154f648-b96d-449e-b0f5-ba32266000c2-serving-cert\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:24:34.088421 master-0 kubenswrapper[13984]: I0312 12:24:34.088360 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:24:34.088715 master-0 kubenswrapper[13984]: I0312 12:24:34.088639 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8a121d0d-d201-446b-97a1-e2414e599f4a-serving-cert\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:24:34.088838 master-0 kubenswrapper[13984]: I0312 12:24:34.088810 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3b704e7-1291-4645-8a0d-2a937829d7ac-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" Mar 12 12:24:34.088888 master-0 kubenswrapper[13984]: I0312 12:24:34.088851 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf" Mar 12 12:24:34.088943 master-0 kubenswrapper[13984]: I0312 12:24:34.087685 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/99a11fe6-48a1-439e-b788-158dbe267dcd-images\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" Mar 12 12:24:34.089129 master-0 kubenswrapper[13984]: I0312 12:24:34.089109 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:24:34.089129 master-0 kubenswrapper[13984]: I0312 12:24:34.089104 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-key\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: 
\"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" Mar 12 12:24:34.089344 master-0 kubenswrapper[13984]: I0312 12:24:34.089325 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/ae2269d7-f11f-46d1-95e7-f89a70ee1152-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089505 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/ab087440-bdf2-4e2f-9a5a-434d50a2329a-etcd-ca\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089641 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089693 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-trusted-ca-bundle\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089713 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-node-pullsecrets\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089753 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089799 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089828 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd4jz\" (UniqueName: \"kubernetes.io/projected/02d5a507-4409-44b4-98bc-1751cdcc6c6a-kube-api-access-jd4jz\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089859 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod 
\"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.089863 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/15bf86d9-62b3-4af8-b6f6-23131d712332-signing-cabundle\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090100 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/ae2269d7-f11f-46d1-95e7-f89a70ee1152-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090132 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090158 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090189 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ft5sd\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090209 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090223 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/02d5a507-4409-44b4-98bc-1751cdcc6c6a-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090264 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090285 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68c57a64-f30c-4caf-89ef-08bd0d36833e-snapshots\") pod 
\"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090302 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-systemd\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090321 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090342 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090360 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090385 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svpvs\" (UniqueName: \"kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090413 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090438 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090459 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090582 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/68c57a64-f30c-4caf-89ef-08bd0d36833e-snapshots\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" Mar 
12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090595 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090635 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090671 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a154f648-b96d-449e-b0f5-ba32266000c2-available-featuregates\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090695 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090759 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgxbv\" 
(UniqueName: \"kubernetes.io/projected/c873b656-d2aa-4d0e-aa22-9f8d35186473-kube-api-access-kgxbv\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090845 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-binary-copy\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090906 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb249\" (UniqueName: \"kubernetes.io/projected/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-kube-api-access-tb249\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090950 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.090952 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" Mar 12 
12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091010 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-audit-policies\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091125 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091176 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091211 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-serving-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091348 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: 
\"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091381 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/720101f1-0833-45af-a5b7-4910ece2a589-audit-policies\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091618 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091674 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/c873b656-d2aa-4d0e-aa22-9f8d35186473-etcd-serving-ca\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091686 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-daemon-config\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.091721 master-0 kubenswrapper[13984]: I0312 12:24:34.091694 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.095824 master-0 kubenswrapper[13984]: I0312 12:24:34.095675 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 12:24:34.100578 master-0 kubenswrapper[13984]: I0312 12:24:34.100535 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:34.116236 master-0 kubenswrapper[13984]: I0312 12:24:34.116195 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 12:24:34.121220 master-0 kubenswrapper[13984]: I0312 12:24:34.121173 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.136924 master-0 kubenswrapper[13984]: I0312 12:24:34.136878 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 12:24:34.144041 master-0 kubenswrapper[13984]: I0312 12:24:34.143987 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-config\") pod \"machine-approver-754bdc9f9d-2hk7d\" 
(UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" Mar 12 12:24:34.155998 master-0 kubenswrapper[13984]: I0312 12:24:34.155938 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 12:24:34.159575 master-0 kubenswrapper[13984]: I0312 12:24:34.159510 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a346ac54-02fe-417f-a49d-038e45b13a1d-serving-cert\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.184159 master-0 kubenswrapper[13984]: I0312 12:24:34.183918 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 12:24:34.184652 master-0 kubenswrapper[13984]: I0312 12:24:34.184605 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.193451 master-0 kubenswrapper[13984]: I0312 12:24:34.193390 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.193652 master-0 kubenswrapper[13984]: I0312 12:24:34.193469 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.193652 master-0 kubenswrapper[13984]: I0312 12:24:34.193534 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-netns\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.193652 master-0 kubenswrapper[13984]: I0312 12:24:34.193541 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-cnibin\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.193652 master-0 kubenswrapper[13984]: I0312 12:24:34.193584 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:24:34.193652 master-0 kubenswrapper[13984]: I0312 12:24:34.193638 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.193784 master-0 kubenswrapper[13984]: I0312 12:24:34.193677 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-kubelet\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.193784 master-0 kubenswrapper[13984]: I0312 12:24:34.193710 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.193784 master-0 kubenswrapper[13984]: I0312 12:24:34.193743 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.193784 master-0 kubenswrapper[13984]: I0312 12:24:34.193751 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/54612733-158f-4a92-a1bf-f4a8d653ffaf-host-slash\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:24:34.193881 master-0 kubenswrapper[13984]: I0312 12:24:34.193791 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-bin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.193881 master-0 kubenswrapper[13984]: I0312 12:24:34.193803 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-lib-modules\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.193881 master-0 kubenswrapper[13984]: I0312 12:24:34.193857 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-tuning-conf-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.193959 master-0 kubenswrapper[13984]: I0312 12:24:34.193885 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.193959 master-0 kubenswrapper[13984]: I0312 12:24:34.193916 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-sys\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.193959 master-0 kubenswrapper[13984]: I0312 12:24:34.193925 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-lib-modules\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.194037 master-0 kubenswrapper[13984]: I0312 12:24:34.193967 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-bin\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.194037 master-0 kubenswrapper[13984]: I0312 12:24:34.193979 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-sys\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.194037 master-0 kubenswrapper[13984]: I0312 12:24:34.194016 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.194135 master-0 kubenswrapper[13984]: I0312 12:24:34.194069 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit-dir\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.194135 master-0 kubenswrapper[13984]: I0312 12:24:34.194098 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:34.194135 master-0 kubenswrapper[13984]: I0312 12:24:34.194125 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194142 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-var-lib-kubelet\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194161 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194191 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194437 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194506 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-k8s-cni-cncf-io\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194539 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-audit-dir\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194578 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-netns\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194587 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-var-lib-kubelet\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194624 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-run-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194667 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194698 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194705 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194730 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-run-multus-certs\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194759 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194789 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194837 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-conf\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194862 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-host-var-lib-cni-multus\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194867 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194870 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194907 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/61ab511b-72e9-4fb9-b5de-770f49514369-host-etc-kube\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194909 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194930 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.194939 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.195000 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.195073 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-var-lib-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.195097 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.195129 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/81cb0504-9455-4398-aed1-5cc6790f292e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.195137 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-conf\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.195620 master-0 kubenswrapper[13984]: I0312 12:24:34.195161 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 
12:24:34.195715 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/81cb0504-9455-4398-aed1-5cc6790f292e-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.195891 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.195917 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-log-socket\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.195936 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-host\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196068 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-host\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: 
I0312 12:24:34.196217 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196237 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196359 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-node-pullsecrets\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196367 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196423 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196459 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196503 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-systemd\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196539 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196568 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-systemd-units\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196603 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196625 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/c873b656-d2aa-4d0e-aa22-9f8d35186473-node-pullsecrets\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196629 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196666 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196684 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-os-release\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196731 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196767 13984 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-systemd\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196833 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-system-cni-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.196889 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-os-release\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.197319 master-0 kubenswrapper[13984]: I0312 12:24:34.197024 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-etc-kubernetes\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.197783 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.197824 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.197875 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.197913 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198043 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36852fda-6aee-4a36-8724-537f1260c4c8-hosts-file\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198071 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198268 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-containers\" (UniqueName: \"kubernetes.io/host-path/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198316 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-ovn\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198359 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/36852fda-6aee-4a36-8724-537f1260c4c8-hosts-file\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198365 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/720101f1-0833-45af-a5b7-4910ece2a589-audit-dir\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198401 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-socket-dir-parent\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198407 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198437 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/720101f1-0833-45af-a5b7-4910ece2a589-audit-dir\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198530 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198560 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-run\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.198624 master-0 kubenswrapper[13984]: I0312 12:24:34.198611 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198654 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198817 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-kubernetes\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198839 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-etc-openvswitch\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198857 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-run\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198887 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198890 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-kubernetes\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198918 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-multus-conf-dir\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198930 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-modprobe-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198921 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysctl-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.198982 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199028 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-modprobe-d\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199046 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199084 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199135 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199179 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199214 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199238 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199263 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199286 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199308 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199435 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199507 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysconfig\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199532 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199633 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-slash\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199658 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-cnibin\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199668 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" 
(UniqueName: \"kubernetes.io/host-path/666857a1-0ddf-4b48-91f4-44cce154d1b1-hostroot\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199741 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-etc-sysconfig\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199766 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-run-systemd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199789 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10498208-0692-4533-b672-a7a2cfcdf1be-system-cni-dir\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199812 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-cni-netd\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.199835 master-0 kubenswrapper[13984]: I0312 12:24:34.199833 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-host-kubelet\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.200808 master-0 kubenswrapper[13984]: I0312 12:24:34.199927 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/022dd526-0ea5-4224-9d2e-778ed4ef8a56-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:24:34.200808 master-0 kubenswrapper[13984]: I0312 12:24:34.199956 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-node-log\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.200808 master-0 kubenswrapper[13984]: I0312 12:24:34.199978 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:34.200808 master-0 kubenswrapper[13984]: I0312 12:24:34.200198 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-images\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:34.220006 master-0 kubenswrapper[13984]: I0312 12:24:34.217610 13984 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 12 12:24:34.226429 master-0 kubenswrapper[13984]: I0312 12:24:34.225207 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:34.226429 master-0 kubenswrapper[13984]: I0312 12:24:34.225358 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.226429 master-0 kubenswrapper[13984]: I0312 12:24:34.225634 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.232328 master-0 kubenswrapper[13984]: I0312 12:24:34.232294 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:34.236145 master-0 kubenswrapper[13984]: I0312 12:24:34.235984 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 12:24:34.236638 master-0 kubenswrapper[13984]: I0312 12:24:34.236462 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:34.255915 master-0 kubenswrapper[13984]: I0312 12:24:34.255868 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 12 12:24:34.261559 master-0 kubenswrapper[13984]: I0312 12:24:34.261516 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:34.278938 master-0 kubenswrapper[13984]: I0312 12:24:34.278880 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 12 12:24:34.281462 master-0 kubenswrapper[13984]: I0312 12:24:34.281416 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-env-overrides\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.283756 master-0 kubenswrapper[13984]: I0312 12:24:34.283712 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:24:34.295719 master-0 kubenswrapper[13984]: I0312 12:24:34.295630 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 12 12:24:34.300400 master-0 kubenswrapper[13984]: I0312 12:24:34.300322 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") pod \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " Mar 12 12:24:34.300400 master-0 kubenswrapper[13984]: I0312 12:24:34.300381 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") pod \"a97fcd56-aa52-414a-b370-154c1b34c1ed\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " Mar 12 12:24:34.300608 master-0 kubenswrapper[13984]: I0312 12:24:34.300450 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") pod \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " Mar 12 12:24:34.300608 master-0 kubenswrapper[13984]: I0312 12:24:34.300445 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock" (OuterVolumeSpecName: "var-lock") pod "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:34.300608 master-0 kubenswrapper[13984]: I0312 12:24:34.300529 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a97fcd56-aa52-414a-b370-154c1b34c1ed" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:34.300608 master-0 kubenswrapper[13984]: I0312 12:24:34.300542 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") pod \"a97fcd56-aa52-414a-b370-154c1b34c1ed\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " Mar 12 12:24:34.300608 master-0 kubenswrapper[13984]: I0312 12:24:34.300556 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:34.301019 master-0 kubenswrapper[13984]: I0312 12:24:34.300654 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "a97fcd56-aa52-414a-b370-154c1b34c1ed" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:34.301638 master-0 kubenswrapper[13984]: I0312 12:24:34.301593 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:34.301638 master-0 kubenswrapper[13984]: I0312 12:24:34.301619 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:34.301638 master-0 kubenswrapper[13984]: I0312 12:24:34.301631 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:34.301638 master-0 kubenswrapper[13984]: I0312 12:24:34.301643 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a97fcd56-aa52-414a-b370-154c1b34c1ed-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:34.316203 master-0 kubenswrapper[13984]: I0312 12:24:34.316157 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 12 12:24:34.336812 master-0 kubenswrapper[13984]: I0312 12:24:34.336750 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 12 12:24:34.339416 master-0 kubenswrapper[13984]: I0312 12:24:34.339357 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-config\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.339948 master-0 
kubenswrapper[13984]: I0312 12:24:34.339902 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" Mar 12 12:24:34.355321 master-0 kubenswrapper[13984]: I0312 12:24:34.355260 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 12 12:24:34.376076 master-0 kubenswrapper[13984]: I0312 12:24:34.375995 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 12:24:34.379178 master-0 kubenswrapper[13984]: I0312 12:24:34.379137 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovnkube-script-lib\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:34.396196 master-0 kubenswrapper[13984]: I0312 12:24:34.396150 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 12 12:24:34.416891 master-0 kubenswrapper[13984]: I0312 12:24:34.416798 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 12 12:24:34.424893 master-0 kubenswrapper[13984]: I0312 12:24:34.424838 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-ovn-node-metrics-cert\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" 
Mar 12 12:24:34.437221 master-0 kubenswrapper[13984]: I0312 12:24:34.437069 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 12:24:34.441599 master-0 kubenswrapper[13984]: I0312 12:24:34.441531 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a346ac54-02fe-417f-a49d-038e45b13a1d-config\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv" Mar 12 12:24:34.456810 master-0 kubenswrapper[13984]: I0312 12:24:34.456732 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 12 12:24:34.477004 master-0 kubenswrapper[13984]: I0312 12:24:34.476946 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 12 12:24:34.479157 master-0 kubenswrapper[13984]: I0312 12:24:34.479098 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/54612733-158f-4a92-a1bf-f4a8d653ffaf-iptables-alerter-script\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9" Mar 12 12:24:34.497565 master-0 kubenswrapper[13984]: I0312 12:24:34.497451 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 12 12:24:34.502281 master-0 kubenswrapper[13984]: I0312 12:24:34.502230 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-config\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: 
\"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:34.516995 master-0 kubenswrapper[13984]: I0312 12:24:34.516907 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 12 12:24:34.536381 master-0 kubenswrapper[13984]: I0312 12:24:34.536298 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 12 12:24:34.539556 master-0 kubenswrapper[13984]: I0312 12:24:34.539508 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/2bab9dba-235f-467c-9224-634cca9acbd2-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:34.556636 master-0 kubenswrapper[13984]: I0312 12:24:34.556551 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 12 12:24:34.566281 master-0 kubenswrapper[13984]: I0312 12:24:34.566240 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" Mar 12 12:24:34.576062 master-0 kubenswrapper[13984]: I0312 12:24:34.576014 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dbn77" Mar 12 12:24:34.606305 master-0 kubenswrapper[13984]: I0312 12:24:34.606221 13984 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"trusted-ca" Mar 12 12:24:34.612981 master-0 kubenswrapper[13984]: I0312 12:24:34.612924 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a22189f2-3f35-4ea6-9892-39a1b46637e2-trusted-ca\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:24:34.617235 master-0 kubenswrapper[13984]: I0312 12:24:34.617159 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 12 12:24:34.619762 master-0 kubenswrapper[13984]: I0312 12:24:34.619654 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-images\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:34.636724 master-0 kubenswrapper[13984]: I0312 12:24:34.636649 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 12:24:34.656551 master-0 kubenswrapper[13984]: I0312 12:24:34.656461 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 12:24:34.659652 master-0 kubenswrapper[13984]: I0312 12:24:34.659584 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bab9dba-235f-467c-9224-634cca9acbd2-config\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" Mar 12 12:24:34.676051 master-0 kubenswrapper[13984]: I0312 12:24:34.675975 13984 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 12 12:24:34.697203 master-0 kubenswrapper[13984]: I0312 12:24:34.697093 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 12 12:24:34.706977 master-0 kubenswrapper[13984]: I0312 12:24:34.706898 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/55bf535c-93ab-4870-a9d2-c02496d71ef0-config\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:24:34.709076 master-0 kubenswrapper[13984]: I0312 12:24:34.709022 13984 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 12 12:24:34.712473 master-0 kubenswrapper[13984]: I0312 12:24:34.712418 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 12 12:24:34.712581 master-0 kubenswrapper[13984]: I0312 12:24:34.712492 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 12 12:24:34.712581 master-0 kubenswrapper[13984]: I0312 12:24:34.712520 13984 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 12 12:24:34.712895 master-0 kubenswrapper[13984]: I0312 12:24:34.712859 13984 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 12 12:24:34.717248 master-0 kubenswrapper[13984]: I0312 12:24:34.717208 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 12 12:24:34.737085 master-0 kubenswrapper[13984]: I0312 12:24:34.737019 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 12 12:24:34.738379 master-0 kubenswrapper[13984]: I0312 12:24:34.738330 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/022dd526-0ea5-4224-9d2e-778ed4ef8a56-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:24:34.756330 master-0 kubenswrapper[13984]: I0312 12:24:34.756269 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 12 12:24:34.765266 master-0 kubenswrapper[13984]: I0312 12:24:34.765179 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a22189f2-3f35-4ea6-9892-39a1b46637e2-metrics-tls\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:24:34.788966 master-0 kubenswrapper[13984]: I0312 12:24:34.788887 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 12 12:24:34.797932 master-0 kubenswrapper[13984]: I0312 12:24:34.797355 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 12 12:24:34.806270 master-0 kubenswrapper[13984]: I0312 12:24:34.806193 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:24:34.816258 master-0 kubenswrapper[13984]: I0312 12:24:34.815974 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-dk5lp"
Mar 12 12:24:34.836524 master-0 kubenswrapper[13984]: I0312 12:24:34.836367 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 12 12:24:34.858595 master-0 kubenswrapper[13984]: I0312 12:24:34.857053 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 12 12:24:34.862639 master-0 kubenswrapper[13984]: I0312 12:24:34.862553 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/55bf535c-93ab-4870-a9d2-c02496d71ef0-serving-cert\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:24:34.882593 master-0 kubenswrapper[13984]: I0312 12:24:34.879301 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 12 12:24:34.882593 master-0 kubenswrapper[13984]: I0312 12:24:34.880629 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/632651f7-6641-49d8-9c48-7f6ea5846538-cert\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"
Mar 12 12:24:34.908548 master-0 kubenswrapper[13984]: I0312 12:24:34.908448 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 12 12:24:34.916590 master-0 kubenswrapper[13984]: I0312 12:24:34.916520 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 12 12:24:34.937248 master-0 kubenswrapper[13984]: I0312 12:24:34.937197 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 12 12:24:34.945762 master-0 kubenswrapper[13984]: I0312 12:24:34.945725 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:24:34.956269 master-0 kubenswrapper[13984]: I0312 12:24:34.956153 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 12 12:24:34.960510 master-0 kubenswrapper[13984]: I0312 12:24:34.960420 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:24:34.974205 master-0 kubenswrapper[13984]: I0312 12:24:34.974169 13984 request.go:700] Waited for 1.015224839s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ovn-kubernetes/secrets?fieldSelector=metadata.name%3Dovn-control-plane-metrics-cert&limit=500&resourceVersion=0
Mar 12 12:24:34.976325 master-0 kubenswrapper[13984]: I0312 12:24:34.976294 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 12 12:24:34.987200 master-0 kubenswrapper[13984]: I0312 12:24:34.987120 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f3a291a-d9af-4e0f-a307-8928e4dc523d-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"
Mar 12 12:24:34.996066 master-0 kubenswrapper[13984]: I0312 12:24:34.996009 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 12:24:35.024956 master-0 kubenswrapper[13984]: I0312 12:24:35.024885 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 12:24:35.035580 master-0 kubenswrapper[13984]: I0312 12:24:35.034812 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:24:35.036292 master-0 kubenswrapper[13984]: I0312 12:24:35.036236 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 12:24:35.038422 master-0 kubenswrapper[13984]: I0312 12:24:35.038366 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:24:35.057251 master-0 kubenswrapper[13984]: I0312 12:24:35.057196 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 12 12:24:35.060465 master-0 kubenswrapper[13984]: I0312 12:24:35.060421 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d47f860-d64a-49b8-b404-a67cbc2faeb6-metrics-tls\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:24:35.073172 master-0 kubenswrapper[13984]: E0312 12:24:35.073106 13984 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.073334 master-0 kubenswrapper[13984]: E0312 12:24:35.073286 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert podName:d1d16bbc-778b-4fc1-abb2-b43e79a7c532 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.573244368 +0000 UTC m=+7.771259910 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-tcc85" (UID: "d1d16bbc-778b-4fc1-abb2-b43e79a7c532") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.073862 master-0 kubenswrapper[13984]: E0312 12:24:35.073809 13984 secret.go:189] Couldn't get secret openshift-network-operator/metrics-tls: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.073954 master-0 kubenswrapper[13984]: E0312 12:24:35.073906 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls podName:61ab511b-72e9-4fb9-b5de-770f49514369 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.573878693 +0000 UTC m=+7.771894235 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls") pod "network-operator-7c649bf6d4-rbb5m" (UID: "61ab511b-72e9-4fb9-b5de-770f49514369") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.074027 master-0 kubenswrapper[13984]: E0312 12:24:35.074008 13984 configmap.go:193] Couldn't get configMap openshift-multus/whereabouts-config: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.074111 master-0 kubenswrapper[13984]: E0312 12:24:35.074080 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap podName:10498208-0692-4533-b672-a7a2cfcdf1be nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.57406092 +0000 UTC m=+7.772076462 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "whereabouts-configmap" (UniqueName: "kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap") pod "multus-additional-cni-plugins-r86hc" (UID: "10498208-0692-4533-b672-a7a2cfcdf1be") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.074292 master-0 kubenswrapper[13984]: E0312 12:24:35.074253 13984 configmap.go:193] Couldn't get configMap openshift-apiserver-operator/openshift-apiserver-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.074379 master-0 kubenswrapper[13984]: E0312 12:24:35.074351 13984 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.074442 master-0 kubenswrapper[13984]: E0312 12:24:35.074381 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config podName:ea80247e-b4dd-45dc-8255-6e68508c8480 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.574351852 +0000 UTC m=+7.772367374 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config") pod "openshift-apiserver-operator-799b6db4d7-gc2gv" (UID: "ea80247e-b4dd-45dc-8255-6e68508c8480") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.074442 master-0 kubenswrapper[13984]: E0312 12:24:35.074432 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca podName:ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.574412044 +0000 UTC m=+7.772427636 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca") pod "controller-manager-6f87d47d96-c24tv" (UID: "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.076394 master-0 kubenswrapper[13984]: E0312 12:24:35.075723 13984 secret.go:189] Couldn't get secret openshift-apiserver-operator/openshift-apiserver-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.076394 master-0 kubenswrapper[13984]: E0312 12:24:35.075827 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert podName:ea80247e-b4dd-45dc-8255-6e68508c8480 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.575797968 +0000 UTC m=+7.773813500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert") pod "openshift-apiserver-operator-799b6db4d7-gc2gv" (UID: "ea80247e-b4dd-45dc-8255-6e68508c8480") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.076394 master-0 kubenswrapper[13984]: E0312 12:24:35.075912 13984 configmap.go:193] Couldn't get configMap openshift-kube-controller-manager-operator/kube-controller-manager-operator-config: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.076394 master-0 kubenswrapper[13984]: E0312 12:24:35.075968 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config podName:6571f5e5-07ee-4e6c-a8ad-277bc52e35ee nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.575951315 +0000 UTC m=+7.773966857 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config") pod "kube-controller-manager-operator-86d7cdfdfb-d4htx" (UID: "6571f5e5-07ee-4e6c-a8ad-277bc52e35ee") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.076394 master-0 kubenswrapper[13984]: E0312 12:24:35.076003 13984 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.076394 master-0 kubenswrapper[13984]: E0312 12:24:35.076056 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert podName:d961a5f0-84b7-47d7-846b-238475947121 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.576039308 +0000 UTC m=+7.774054920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert") pod "catalog-operator-7d9c49f57b-nwk7v" (UID: "d961a5f0-84b7-47d7-846b-238475947121") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.076394 master-0 kubenswrapper[13984]: I0312 12:24:35.076377 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:24:35.076909 master-0 kubenswrapper[13984]: E0312 12:24:35.076758 13984 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.076909 master-0 kubenswrapper[13984]: E0312 12:24:35.076863 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.576840039 +0000 UTC m=+7.774855581 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.078144 master-0 kubenswrapper[13984]: E0312 12:24:35.077843 13984 configmap.go:193] Couldn't get configMap openshift-multus/default-cni-sysctl-allowlist: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.078144 master-0 kubenswrapper[13984]: E0312 12:24:35.077963 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist podName:10498208-0692-4533-b672-a7a2cfcdf1be nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.577940292 +0000 UTC m=+7.775955814 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist") pod "multus-additional-cni-plugins-r86hc" (UID: "10498208-0692-4533-b672-a7a2cfcdf1be") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.078144 master-0 kubenswrapper[13984]: E0312 12:24:35.078099 13984 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.078469 master-0 kubenswrapper[13984]: E0312 12:24:35.078208 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config podName:632651f7-6641-49d8-9c48-7f6ea5846538 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.578183522 +0000 UTC m=+7.776199054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config") pod "cluster-autoscaler-operator-69576476f7-ph7gk" (UID: "632651f7-6641-49d8-9c48-7f6ea5846538") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.078771 master-0 kubenswrapper[13984]: E0312 12:24:35.078712 13984 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.078901 master-0 kubenswrapper[13984]: E0312 12:24:35.078805 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.578782775 +0000 UTC m=+7.776798327 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.079390 master-0 kubenswrapper[13984]: E0312 12:24:35.079265 13984 configmap.go:193] Couldn't get configMap openshift-cluster-node-tuning-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.079390 master-0 kubenswrapper[13984]: E0312 12:24:35.079359 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.579340787 +0000 UTC m=+7.777356309 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.082783 master-0 kubenswrapper[13984]: E0312 12:24:35.082404 13984 configmap.go:193] Couldn't get configMap openshift-cluster-version/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.082783 master-0 kubenswrapper[13984]: E0312 12:24:35.082552 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca podName:114b1d16-b37d-449c-84e3-3fb3f8b20eaa nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.582517172 +0000 UTC m=+7.780532704 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca" (UniqueName: "kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca") pod "cluster-version-operator-8c9c967c7-7dg9w" (UID: "114b1d16-b37d-449c-84e3-3fb3f8b20eaa") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.082783 master-0 kubenswrapper[13984]: E0312 12:24:35.082622 13984 configmap.go:193] Couldn't get configMap openshift-image-registry/trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.082783 master-0 kubenswrapper[13984]: E0312 12:24:35.082688 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca podName:cfd178d7-f518-413b-95ab-ab6687be6e0f nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.582667308 +0000 UTC m=+7.780682880 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca") pod "cluster-image-registry-operator-86d6d77c7c-kcnf4" (UID: "cfd178d7-f518-413b-95ab-ab6687be6e0f") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.083168 master-0 kubenswrapper[13984]: E0312 12:24:35.082866 13984 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.083168 master-0 kubenswrapper[13984]: E0312 12:24:35.083042 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert podName:b9194868-75ce-4138-a9d4-ddd64660c529 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.583008321 +0000 UTC m=+7.781023853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-9vtjp" (UID: "b9194868-75ce-4138-a9d4-ddd64660c529") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.083168 master-0 kubenswrapper[13984]: E0312 12:24:35.083093 13984 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.083168 master-0 kubenswrapper[13984]: E0312 12:24:35.083159 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert podName:114b1d16-b37d-449c-84e3-3fb3f8b20eaa nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.583140116 +0000 UTC m=+7.781155688 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert") pod "cluster-version-operator-8c9c967c7-7dg9w" (UID: "114b1d16-b37d-449c-84e3-3fb3f8b20eaa") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.089450 master-0 kubenswrapper[13984]: E0312 12:24:35.089212 13984 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.089450 master-0 kubenswrapper[13984]: E0312 12:24:35.089304 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert podName:9bc7dea3-1868-488c-a34b-288cde3acd35 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.589277987 +0000 UTC m=+7.787293529 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert") pod "olm-operator-d64cfc9db-sp7w9" (UID: "9bc7dea3-1868-488c-a34b-288cde3acd35") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.089450 master-0 kubenswrapper[13984]: E0312 12:24:35.089370 13984 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.089450 master-0 kubenswrapper[13984]: E0312 12:24:35.089419 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.589405642 +0000 UTC m=+7.787421164 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.089985 master-0 kubenswrapper[13984]: E0312 12:24:35.089879 13984 secret.go:189] Couldn't get secret openshift-kube-controller-manager-operator/kube-controller-manager-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.089985 master-0 kubenswrapper[13984]: E0312 12:24:35.089950 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert podName:6571f5e5-07ee-4e6c-a8ad-277bc52e35ee nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.589932082 +0000 UTC m=+7.787947614 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert") pod "kube-controller-manager-operator-86d7cdfdfb-d4htx" (UID: "6571f5e5-07ee-4e6c-a8ad-277bc52e35ee") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.091336 master-0 kubenswrapper[13984]: E0312 12:24:35.091105 13984 secret.go:189] Couldn't get secret openshift-cluster-olm-operator/cluster-olm-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.091336 master-0 kubenswrapper[13984]: E0312 12:24:35.091141 13984 configmap.go:193] Couldn't get configMap openshift-marketplace/marketplace-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.091336 master-0 kubenswrapper[13984]: E0312 12:24:35.091205 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert podName:3ebe5b05-95d6-43ff-95a4-0c9c7ce70326 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.591187462 +0000 UTC m=+7.789202984 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-olm-operator-serving-cert" (UniqueName: "kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert") pod "cluster-olm-operator-77899cf6d-68k5k" (UID: "3ebe5b05-95d6-43ff-95a4-0c9c7ce70326") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.091336 master-0 kubenswrapper[13984]: E0312 12:24:35.091238 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca podName:3c02552c-a477-4c6c-8a45-2fdc758c084b nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.591218013 +0000 UTC m=+7.789233565 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-trusted-ca" (UniqueName: "kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca") pod "marketplace-operator-64bf9778cb-rgstx" (UID: "3c02552c-a477-4c6c-8a45-2fdc758c084b") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.091336 master-0 kubenswrapper[13984]: E0312 12:24:35.091238 13984 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.091336 master-0 kubenswrapper[13984]: E0312 12:24:35.091302 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls podName:f19c3c89-8d32-4394-bd86-e5ef7734c42b nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.591275175 +0000 UTC m=+7.789290697 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls") pod "cluster-samples-operator-664cb58b85-k9t2m" (UID: "f19c3c89-8d32-4394-bd86-e5ef7734c42b") : failed to sync secret cache: timed out waiting for the condition
Mar 12 12:24:35.092453 master-0 kubenswrapper[13984]: E0312 12:24:35.092409 13984 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.092584 master-0 kubenswrapper[13984]: E0312 12:24:35.092505 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume podName:9d47f860-d64a-49b8-b404-a67cbc2faeb6 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:35.592470682 +0000 UTC m=+7.790486264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume") pod "dns-default-k8t84" (UID: "9d47f860-d64a-49b8-b404-a67cbc2faeb6") : failed to sync configmap cache: timed out waiting for the condition
Mar 12 12:24:35.095785 master-0 kubenswrapper[13984]: I0312 12:24:35.095734 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 12 12:24:35.117557 master-0 kubenswrapper[13984]: I0312 12:24:35.116767 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 12 12:24:35.135537 master-0 kubenswrapper[13984]: I0312 12:24:35.135462 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 12 12:24:35.155613 master-0 kubenswrapper[13984]: I0312 12:24:35.155557 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 12 12:24:35.176819 master-0 kubenswrapper[13984]: I0312 12:24:35.176763 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 12 12:24:35.196133 master-0 kubenswrapper[13984]: I0312 12:24:35.196083 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 12 12:24:35.216969 master-0 kubenswrapper[13984]: I0312 12:24:35.216504 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 12 12:24:35.231670 master-0 kubenswrapper[13984]: I0312 12:24:35.231583 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:24:35.232634 master-0 kubenswrapper[13984]: I0312 12:24:35.231598 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:24:35.236644 master-0 kubenswrapper[13984]: I0312 12:24:35.236612 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 12 12:24:35.256062 master-0 kubenswrapper[13984]: I0312 12:24:35.255994 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 12 12:24:35.276712 master-0 kubenswrapper[13984]: I0312 12:24:35.276634 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 12 12:24:35.296345 master-0 kubenswrapper[13984]: I0312 12:24:35.296292 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 12 12:24:35.316620 master-0 kubenswrapper[13984]: I0312 12:24:35.316549 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 12:24:35.335803 master-0 kubenswrapper[13984]: I0312 12:24:35.335750 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 12 12:24:35.356261 master-0 kubenswrapper[13984]: I0312 12:24:35.356207 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 12 12:24:35.376809 master-0 kubenswrapper[13984]: I0312 12:24:35.376728 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 12 12:24:35.397729 master-0 kubenswrapper[13984]: I0312 12:24:35.397286 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 12 12:24:35.416275 master-0 kubenswrapper[13984]: I0312 12:24:35.416201 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 12 12:24:35.436142 master-0 kubenswrapper[13984]: I0312 12:24:35.436086 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 12 12:24:35.456306 master-0 kubenswrapper[13984]: I0312 12:24:35.456244 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 12 12:24:35.476368 master-0 kubenswrapper[13984]: I0312 12:24:35.476257 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 12 12:24:35.497407 master-0 kubenswrapper[13984]: I0312 12:24:35.497328 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 12 12:24:35.517287 master-0 kubenswrapper[13984]: I0312 12:24:35.517212 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 12 12:24:35.547391 master-0 kubenswrapper[13984]: I0312 12:24:35.547322 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 12 12:24:35.555906 master-0 kubenswrapper[13984]: I0312 12:24:35.555862 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 12 12:24:35.576980 master-0 kubenswrapper[13984]: I0312 12:24:35.576916 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 12 12:24:35.596177 master-0 kubenswrapper[13984]: I0312 12:24:35.596122 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 12 12:24:35.616899 master-0 kubenswrapper[13984]: I0312 12:24:35.616813 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 12 12:24:35.627055 master-0 kubenswrapper[13984]: I0312 12:24:35.626997 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m"
Mar 12 12:24:35.627055 master-0 kubenswrapper[13984]: I0312 12:24:35.627035 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:24:35.627219 master-0 kubenswrapper[13984]: I0312 12:24:35.627065 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:24:35.627219 master-0 kubenswrapper[13984]: I0312 12:24:35.627108 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:24:35.627219
master-0 kubenswrapper[13984]: I0312 12:24:35.627132 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:24:35.627219 master-0 kubenswrapper[13984]: I0312 12:24:35.627205 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627227 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627274 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627305 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627349 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627373 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627395 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627424 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: 
\"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:35.627468 master-0 kubenswrapper[13984]: I0312 12:24:35.627464 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:35.627995 master-0 kubenswrapper[13984]: I0312 12:24:35.627521 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:35.627995 master-0 kubenswrapper[13984]: I0312 12:24:35.627543 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:24:35.627995 master-0 kubenswrapper[13984]: I0312 12:24:35.627562 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:35.627995 master-0 kubenswrapper[13984]: I0312 12:24:35.627611 
13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:24:35.627995 master-0 kubenswrapper[13984]: I0312 12:24:35.627640 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:24:35.627995 master-0 kubenswrapper[13984]: I0312 12:24:35.627667 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:24:35.628339 master-0 kubenswrapper[13984]: I0312 12:24:35.628133 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/632651f7-6641-49d8-9c48-7f6ea5846538-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" Mar 12 12:24:35.628399 master-0 kubenswrapper[13984]: I0312 12:24:35.628378 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls\") pod 
\"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" Mar 12 12:24:35.628460 master-0 kubenswrapper[13984]: I0312 12:24:35.628408 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:24:35.628460 master-0 kubenswrapper[13984]: I0312 12:24:35.628434 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:24:35.628608 master-0 kubenswrapper[13984]: I0312 12:24:35.628549 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:24:35.628674 master-0 kubenswrapper[13984]: I0312 12:24:35.628609 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/cfd178d7-f518-413b-95ab-ab6687be6e0f-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:24:35.628733 master-0 kubenswrapper[13984]: I0312 12:24:35.628614 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84" Mar 12 12:24:35.629051 master-0 kubenswrapper[13984]: I0312 12:24:35.628943 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d47f860-d64a-49b8-b404-a67cbc2faeb6-config-volume\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84" Mar 12 12:24:35.629051 master-0 kubenswrapper[13984]: I0312 12:24:35.628967 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/f19c3c89-8d32-4394-bd86-e5ef7734c42b-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m" Mar 12 12:24:35.629051 master-0 kubenswrapper[13984]: I0312 12:24:35.628994 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61ab511b-72e9-4fb9-b5de-770f49514369-metrics-tls\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m" Mar 12 12:24:35.629051 master-0 kubenswrapper[13984]: I0312 12:24:35.628978 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-config\") pod 
\"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx" Mar 12 12:24:35.629348 master-0 kubenswrapper[13984]: I0312 12:24:35.629130 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea80247e-b4dd-45dc-8255-6e68508c8480-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:24:35.629348 master-0 kubenswrapper[13984]: I0312 12:24:35.629337 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cfd178d7-f518-413b-95ab-ab6687be6e0f-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" Mar 12 12:24:35.629471 master-0 kubenswrapper[13984]: I0312 12:24:35.629427 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:35.629585 master-0 kubenswrapper[13984]: I0312 12:24:35.629533 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9bc7dea3-1868-488c-a34b-288cde3acd35-srv-cert\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:24:35.629921 master-0 kubenswrapper[13984]: I0312 12:24:35.629881 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-serving-cert\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:35.629921 master-0 kubenswrapper[13984]: I0312 12:24:35.629899 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:24:35.636908 master-0 kubenswrapper[13984]: I0312 12:24:35.636880 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 12 12:24:35.656165 master-0 kubenswrapper[13984]: I0312 12:24:35.656106 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 12 12:24:35.677016 master-0 kubenswrapper[13984]: I0312 12:24:35.676943 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 12:24:35.697494 master-0 kubenswrapper[13984]: I0312 12:24:35.697408 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 12 12:24:35.701119 master-0 kubenswrapper[13984]: I0312 12:24:35.701006 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: 
\"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k" Mar 12 12:24:35.716717 master-0 kubenswrapper[13984]: I0312 12:24:35.716634 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 12 12:24:35.719989 master-0 kubenswrapper[13984]: I0312 12:24:35.719940 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-service-ca\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" Mar 12 12:24:35.736287 master-0 kubenswrapper[13984]: I0312 12:24:35.736169 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 12 12:24:35.739716 master-0 kubenswrapper[13984]: I0312 12:24:35.739658 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-whereabouts-configmap\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:35.756371 master-0 kubenswrapper[13984]: I0312 12:24:35.756282 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 12 12:24:35.759396 master-0 kubenswrapper[13984]: I0312 12:24:35.759322 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea80247e-b4dd-45dc-8255-6e68508c8480-config\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv" Mar 12 12:24:35.776721 master-0 kubenswrapper[13984]: I0312 12:24:35.776664 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 12 12:24:35.796726 master-0 kubenswrapper[13984]: I0312 12:24:35.796671 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 12:24:35.798800 master-0 kubenswrapper[13984]: I0312 12:24:35.798753 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/10498208-0692-4533-b672-a7a2cfcdf1be-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc" Mar 12 12:24:35.824919 master-0 kubenswrapper[13984]: I0312 12:24:35.824853 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 12:24:35.829096 master-0 kubenswrapper[13984]: I0312 12:24:35.829031 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:24:35.835910 master-0 kubenswrapper[13984]: I0312 12:24:35.835860 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 12 12:24:35.838995 master-0 kubenswrapper[13984]: I0312 12:24:35.838943 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/3c02552c-a477-4c6c-8a45-2fdc758c084b-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:24:35.857190 master-0 kubenswrapper[13984]: I0312 12:24:35.857109 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 12 12:24:35.876603 master-0 kubenswrapper[13984]: I0312 12:24:35.876365 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 12:24:35.896057 master-0 kubenswrapper[13984]: I0312 12:24:35.895985 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 12 12:24:35.900061 master-0 kubenswrapper[13984]: I0312 12:24:35.900009 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:35.917244 master-0 kubenswrapper[13984]: I0312 12:24:35.917181 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 12 12:24:35.919848 master-0 kubenswrapper[13984]: I0312 12:24:35.919746 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/b9194868-75ce-4138-a9d4-ddd64660c529-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:35.936042 master-0 kubenswrapper[13984]: I0312 12:24:35.935988 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 12 12:24:35.939597 master-0 kubenswrapper[13984]: I0312 12:24:35.939543 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/d961a5f0-84b7-47d7-846b-238475947121-srv-cert\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:24:35.964416 master-0 kubenswrapper[13984]: I0312 12:24:35.964351 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 12 12:24:35.969645 master-0 kubenswrapper[13984]: I0312 12:24:35.969595 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b9194868-75ce-4138-a9d4-ddd64660c529-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" Mar 12 12:24:35.974565 master-0 kubenswrapper[13984]: I0312 12:24:35.974352 13984 request.go:700] Waited for 2.006343772s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cluster-samples-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 12 12:24:35.976631 master-0 kubenswrapper[13984]: I0312 12:24:35.976417 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 12 12:24:35.995601 master-0 
kubenswrapper[13984]: I0312 12:24:35.995419 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-8fdxz" Mar 12 12:24:36.049977 master-0 kubenswrapper[13984]: E0312 12:24:36.049888 13984 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="2.071s" Mar 12 12:24:36.049977 master-0 kubenswrapper[13984]: I0312 12:24:36.049963 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 12 12:24:36.049977 master-0 kubenswrapper[13984]: I0312 12:24:36.049979 13984 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="62ebf93a-f9a1-4681-b086-18be545a70fd" Mar 12 12:24:36.050349 master-0 kubenswrapper[13984]: I0312 12:24:36.050002 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"a9d9b5f96bde28a030172fa8b8562f0ad2738118cc137cd6bc087cf4fbc7972f"} Mar 12 12:24:36.059855 master-0 kubenswrapper[13984]: I0312 12:24:36.059793 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 12 12:24:36.081844 master-0 kubenswrapper[13984]: I0312 12:24:36.081779 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-bound-sa-token\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8" Mar 12 12:24:36.089454 master-0 kubenswrapper[13984]: I0312 12:24:36.089402 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dhfq\" 
(UniqueName: \"kubernetes.io/projected/a346ac54-02fe-417f-a49d-038e45b13a1d-kube-api-access-9dhfq\") pod \"authentication-operator-7c6989d6c4-98xjv\" (UID: \"a346ac54-02fe-417f-a49d-038e45b13a1d\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-98xjv"
Mar 12 12:24:36.111075 master-0 kubenswrapper[13984]: I0312 12:24:36.111013 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5xlx\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-kube-api-access-r5xlx\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:24:36.131634 master-0 kubenswrapper[13984]: I0312 12:24:36.131561 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx2c4\" (UniqueName: \"kubernetes.io/projected/3ebe5b05-95d6-43ff-95a4-0c9c7ce70326-kube-api-access-xx2c4\") pod \"cluster-olm-operator-77899cf6d-68k5k\" (UID: \"3ebe5b05-95d6-43ff-95a4-0c9c7ce70326\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-68k5k"
Mar 12 12:24:36.153822 master-0 kubenswrapper[13984]: I0312 12:24:36.153749 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86k82\" (UniqueName: \"kubernetes.io/projected/632651f7-6641-49d8-9c48-7f6ea5846538-kube-api-access-86k82\") pod \"cluster-autoscaler-operator-69576476f7-ph7gk\" (UID: \"632651f7-6641-49d8-9c48-7f6ea5846538\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk"
Mar 12 12:24:36.180877 master-0 kubenswrapper[13984]: I0312 12:24:36.180811 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btbt7\" (UniqueName: \"kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7\") pod \"controller-manager-6f87d47d96-c24tv\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") " pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:24:36.199052 master-0 kubenswrapper[13984]: I0312 12:24:36.198984 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8a121d0d-d201-446b-97a1-e2414e599f4a-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-mpxz4\" (UID: \"8a121d0d-d201-446b-97a1-e2414e599f4a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-mpxz4"
Mar 12 12:24:36.210540 master-0 kubenswrapper[13984]: I0312 12:24:36.210450 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94src\" (UniqueName: \"kubernetes.io/projected/c7d2a100-a24a-4ae6-bd8e-4530163a3ffe-kube-api-access-94src\") pod \"cluster-baremetal-operator-5cdb4c5598-pb97p\" (UID: \"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p"
Mar 12 12:24:36.230617 master-0 kubenswrapper[13984]: I0312 12:24:36.230540 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhwbb\" (UniqueName: \"kubernetes.io/projected/ed5a074c-e194-4b16-a4c9-0d82830bf7ca-kube-api-access-qhwbb\") pod \"tuned-9zrvj\" (UID: \"ed5a074c-e194-4b16-a4c9-0d82830bf7ca\") " pod="openshift-cluster-node-tuning-operator/tuned-9zrvj"
Mar 12 12:24:36.248877 master-0 kubenswrapper[13984]: I0312 12:24:36.248766 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqcrz\" (UniqueName: \"kubernetes.io/projected/b5890f0c-cebe-4788-89f7-27568d875741-kube-api-access-fqcrz\") pod \"openshift-controller-manager-operator-8565d84698-cg7rd\" (UID: \"b5890f0c-cebe-4788-89f7-27568d875741\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-cg7rd"
Mar 12 12:24:36.270362 master-0 kubenswrapper[13984]: I0312 12:24:36.270302 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bx48\" (UniqueName: \"kubernetes.io/projected/5a012d0b-d1a8-4cd3-8b91-b346d0445f24-kube-api-access-9bx48\") pod \"csi-snapshot-controller-operator-5685fbc7d-vmj4h\" (UID: \"5a012d0b-d1a8-4cd3-8b91-b346d0445f24\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h"
Mar 12 12:24:36.302766 master-0 kubenswrapper[13984]: I0312 12:24:36.302708 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrqx7\" (UniqueName: \"kubernetes.io/projected/666857a1-0ddf-4b48-91f4-44cce154d1b1-kube-api-access-vrqx7\") pod \"multus-hb48g\" (UID: \"666857a1-0ddf-4b48-91f4-44cce154d1b1\") " pod="openshift-multus/multus-hb48g"
Mar 12 12:24:36.313454 master-0 kubenswrapper[13984]: I0312 12:24:36.313406 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns72p\" (UniqueName: \"kubernetes.io/projected/99a11fe6-48a1-439e-b788-158dbe267dcd-kube-api-access-ns72p\") pod \"machine-config-operator-fdb5c78b5-lxvgd\" (UID: \"99a11fe6-48a1-439e-b788-158dbe267dcd\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd"
Mar 12 12:24:36.327651 master-0 kubenswrapper[13984]: I0312 12:24:36.327586 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxrnf\" (UniqueName: \"kubernetes.io/projected/022dd526-0ea5-4224-9d2e-778ed4ef8a56-kube-api-access-pxrnf\") pod \"catalogd-controller-manager-7f8b8b6f4c-pqph7\" (UID: \"022dd526-0ea5-4224-9d2e-778ed4ef8a56\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:24:36.350514 master-0 kubenswrapper[13984]: I0312 12:24:36.350424 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdvf6\" (UniqueName: \"kubernetes.io/projected/d961a5f0-84b7-47d7-846b-238475947121-kube-api-access-zdvf6\") pod \"catalog-operator-7d9c49f57b-nwk7v\" (UID: \"d961a5f0-84b7-47d7-846b-238475947121\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v"
Mar 12 12:24:36.357464 master-0 kubenswrapper[13984]: I0312 12:24:36.357412 13984 scope.go:117] "RemoveContainer" containerID="06f4c15730d5d23bfb91ec1bbc7bab14f9a3a3ae32c22b935d487a1f88576da3"
Mar 12 12:24:36.359111 master-0 kubenswrapper[13984]: I0312 12:24:36.359059 13984 scope.go:117] "RemoveContainer" containerID="c119461954016ba56cd4650cbf6e8e2b03da1364b0a52ff3ba4437048b8fac29"
Mar 12 12:24:36.368761 master-0 kubenswrapper[13984]: I0312 12:24:36.368685 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvrpf\" (UniqueName: \"kubernetes.io/projected/7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef-kube-api-access-wvrpf\") pod \"csi-snapshot-controller-7577d6f48-kf7kw\" (UID: \"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw"
Mar 12 12:24:36.393920 master-0 kubenswrapper[13984]: I0312 12:24:36.393852 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq5c7\" (UniqueName: \"kubernetes.io/projected/e64bc838-280e-4231-9732-1adb69fed0bc-kube-api-access-tq5c7\") pod \"network-metrics-daemon-4m9jh\" (UID: \"e64bc838-280e-4231-9732-1adb69fed0bc\") " pod="openshift-multus/network-metrics-daemon-4m9jh"
Mar 12 12:24:36.407945 master-0 kubenswrapper[13984]: I0312 12:24:36.407626 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/114b1d16-b37d-449c-84e3-3fb3f8b20eaa-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-7dg9w\" (UID: \"114b1d16-b37d-449c-84e3-3fb3f8b20eaa\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w"
Mar 12 12:24:36.440987 master-0 kubenswrapper[13984]: I0312 12:24:36.440920 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf74f\" (UniqueName: \"kubernetes.io/projected/580bafd6-af8c-4961-b959-b736a180e309-kube-api-access-lf74f\") pod \"migrator-57ccdf9b5-4rjrp\" (UID: \"580bafd6-af8c-4961-b959-b736a180e309\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-4rjrp"
Mar 12 12:24:36.448825 master-0 kubenswrapper[13984]: I0312 12:24:36.448375 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54547\" (UniqueName: \"kubernetes.io/projected/36852fda-6aee-4a36-8724-537f1260c4c8-kube-api-access-54547\") pod \"node-resolver-72w9q\" (UID: \"36852fda-6aee-4a36-8724-537f1260c4c8\") " pod="openshift-dns/node-resolver-72w9q"
Mar 12 12:24:36.469409 master-0 kubenswrapper[13984]: I0312 12:24:36.468341 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wg27g\" (UniqueName: \"kubernetes.io/projected/51d58450-50bb-4da0-b1f6-4135fbabd856-kube-api-access-wg27g\") pod \"network-node-identity-rzmhl\" (UID: \"51d58450-50bb-4da0-b1f6-4135fbabd856\") " pod="openshift-network-node-identity/network-node-identity-rzmhl"
Mar 12 12:24:36.489351 master-0 kubenswrapper[13984]: I0312 12:24:36.489311 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8mvz\" (UniqueName: \"kubernetes.io/projected/a154f648-b96d-449e-b0f5-ba32266000c2-kube-api-access-x8mvz\") pod \"openshift-config-operator-64488f9d78-fg5mg\" (UID: \"a154f648-b96d-449e-b0f5-ba32266000c2\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:24:36.514534 master-0 kubenswrapper[13984]: I0312 12:24:36.514272 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cfd178d7-f518-413b-95ab-ab6687be6e0f-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kcnf4\" (UID: \"cfd178d7-f518-413b-95ab-ab6687be6e0f\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4"
Mar 12 12:24:36.530073 master-0 kubenswrapper[13984]: I0312 12:24:36.529047 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdwfl\" (UniqueName: \"kubernetes.io/projected/10498208-0692-4533-b672-a7a2cfcdf1be-kube-api-access-xdwfl\") pod \"multus-additional-cni-plugins-r86hc\" (UID: \"10498208-0692-4533-b672-a7a2cfcdf1be\") " pod="openshift-multus/multus-additional-cni-plugins-r86hc"
Mar 12 12:24:36.548121 master-0 kubenswrapper[13984]: I0312 12:24:36.548086 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5jgq\" (UniqueName: \"kubernetes.io/projected/81cb0504-9455-4398-aed1-5cc6790f292e-kube-api-access-j5jgq\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-pvjft\" (UID: \"81cb0504-9455-4398-aed1-5cc6790f292e\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft"
Mar 12 12:24:36.567036 master-0 kubenswrapper[13984]: I0312 12:24:36.566994 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv9kn\" (UniqueName: \"kubernetes.io/projected/2bab9dba-235f-467c-9224-634cca9acbd2-kube-api-access-cv9kn\") pod \"machine-api-operator-84bf6db4f9-nq8zw\" (UID: \"2bab9dba-235f-467c-9224-634cca9acbd2\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw"
Mar 12 12:24:36.586568 master-0 kubenswrapper[13984]: I0312 12:24:36.586523 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wj8x\" (UniqueName: \"kubernetes.io/projected/aa8ddfdd-7f2d-4fd4-b666-1497dee752df-kube-api-access-6wj8x\") pod \"operator-controller-controller-manager-6598bfb6c4-2849p\" (UID: \"aa8ddfdd-7f2d-4fd4-b666-1497dee752df\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p"
Mar 12 12:24:36.606849 master-0 kubenswrapper[13984]: I0312 12:24:36.606786 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l57v\" (UniqueName: \"kubernetes.io/projected/ae2269d7-f11f-46d1-95e7-f89a70ee1152-kube-api-access-6l57v\") pod \"cluster-monitoring-operator-674cbfbd9d-tztzr\" (UID: \"ae2269d7-f11f-46d1-95e7-f89a70ee1152\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-tztzr"
Mar 12 12:24:36.627234 master-0 kubenswrapper[13984]: I0312 12:24:36.627174 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jq9d5\" (UniqueName: \"kubernetes.io/projected/d1d16bbc-778b-4fc1-abb2-b43e79a7c532-kube-api-access-jq9d5\") pod \"package-server-manager-854648ff6d-tcc85\" (UID: \"d1d16bbc-778b-4fc1-abb2-b43e79a7c532\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85"
Mar 12 12:24:36.648064 master-0 kubenswrapper[13984]: I0312 12:24:36.648021 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4r7v\" (UniqueName: \"kubernetes.io/projected/b9194868-75ce-4138-a9d4-ddd64660c529-kube-api-access-s4r7v\") pod \"cluster-node-tuning-operator-66c7586884-9vtjp\" (UID: \"b9194868-75ce-4138-a9d4-ddd64660c529\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp"
Mar 12 12:24:36.668299 master-0 kubenswrapper[13984]: I0312 12:24:36.668030 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lnbq\" (UniqueName: \"kubernetes.io/projected/54612733-158f-4a92-a1bf-f4a8d653ffaf-kube-api-access-9lnbq\") pod \"iptables-alerter-xqmw9\" (UID: \"54612733-158f-4a92-a1bf-f4a8d653ffaf\") " pod="openshift-network-operator/iptables-alerter-xqmw9"
Mar 12 12:24:36.694579 master-0 kubenswrapper[13984]: I0312 12:24:36.694529 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7tct\" (UniqueName: \"kubernetes.io/projected/15bf86d9-62b3-4af8-b6f6-23131d712332-kube-api-access-g7tct\") pod \"service-ca-84bfdbbb7f-gxx99\" (UID: \"15bf86d9-62b3-4af8-b6f6-23131d712332\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99"
Mar 12 12:24:36.718184 master-0 kubenswrapper[13984]: I0312 12:24:36.718124 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbztv\" (UniqueName: \"kubernetes.io/projected/ea80247e-b4dd-45dc-8255-6e68508c8480-kube-api-access-xbztv\") pod \"openshift-apiserver-operator-799b6db4d7-gc2gv\" (UID: \"ea80247e-b4dd-45dc-8255-6e68508c8480\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-gc2gv"
Mar 12 12:24:36.726870 master-0 kubenswrapper[13984]: I0312 12:24:36.726822 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xggp6\" (UniqueName: \"kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6\") pod \"route-controller-manager-7665b44c8d-2lgnf\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") " pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:24:36.746071 master-0 kubenswrapper[13984]: I0312 12:24:36.746028 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njx9l\" (UniqueName: \"kubernetes.io/projected/9d47f860-d64a-49b8-b404-a67cbc2faeb6-kube-api-access-njx9l\") pod \"dns-default-k8t84\" (UID: \"9d47f860-d64a-49b8-b404-a67cbc2faeb6\") " pod="openshift-dns/dns-default-k8t84"
Mar 12 12:24:36.768756 master-0 kubenswrapper[13984]: I0312 12:24:36.768632 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwfct\" (UniqueName: \"kubernetes.io/projected/ab087440-bdf2-4e2f-9a5a-434d50a2329a-kube-api-access-pwfct\") pod \"etcd-operator-5884b9cd56-7nb6b\" (UID: \"ab087440-bdf2-4e2f-9a5a-434d50a2329a\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b"
Mar 12 12:24:36.787709 master-0 kubenswrapper[13984]: I0312 12:24:36.787630 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2f6r\" (UniqueName: \"kubernetes.io/projected/2f3a291a-d9af-4e0f-a307-8928e4dc523d-kube-api-access-b2f6r\") pod \"ovnkube-control-plane-66b55d57d-mlfvv\" (UID: \"2f3a291a-d9af-4e0f-a307-8928e4dc523d\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv"
Mar 12 12:24:36.808134 master-0 kubenswrapper[13984]: I0312 12:24:36.808053 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv9fk\" (UniqueName: \"kubernetes.io/projected/61ab511b-72e9-4fb9-b5de-770f49514369-kube-api-access-kv9fk\") pod \"network-operator-7c649bf6d4-rbb5m\" (UID: \"61ab511b-72e9-4fb9-b5de-770f49514369\") " pod="openshift-network-operator/network-operator-7c649bf6d4-rbb5m"
Mar 12 12:24:36.829807 master-0 kubenswrapper[13984]: I0312 12:24:36.829752 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod \"multus-admission-controller-8d675b596-xpzn2\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") " pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:24:36.862163 master-0 kubenswrapper[13984]: I0312 12:24:36.862079 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p97xk\" (UniqueName: \"kubernetes.io/projected/3c02552c-a477-4c6c-8a45-2fdc758c084b-kube-api-access-p97xk\") pod \"marketplace-operator-64bf9778cb-rgstx\" (UID: \"3c02552c-a477-4c6c-8a45-2fdc758c084b\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx"
Mar 12 12:24:36.874430 master-0 kubenswrapper[13984]: I0312 12:24:36.874369 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9gmt\" (UniqueName: \"kubernetes.io/projected/0aeeef2a-f9df-4f87-b985-bd1da94c76c3-kube-api-access-m9gmt\") pod \"kube-storage-version-migrator-operator-7f65c457f5-qpd6h\" (UID: \"0aeeef2a-f9df-4f87-b985-bd1da94c76c3\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h"
Mar 12 12:24:36.891646 master-0 kubenswrapper[13984]: I0312 12:24:36.891583 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4nbb\" (UniqueName: \"kubernetes.io/projected/68c57a64-f30c-4caf-89ef-08bd0d36833e-kube-api-access-x4nbb\") pod \"insights-operator-8f89dfddd-4vzl8\" (UID: \"68c57a64-f30c-4caf-89ef-08bd0d36833e\") " pod="openshift-insights/insights-operator-8f89dfddd-4vzl8"
Mar 12 12:24:36.918206 master-0 kubenswrapper[13984]: I0312 12:24:36.918086 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9b960fe2-d59e-4ee1-bd9d-455b46753cb9-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-2kkmf\" (UID: \"9b960fe2-d59e-4ee1-bd9d-455b46753cb9\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-2kkmf"
Mar 12 12:24:36.939925 master-0 kubenswrapper[13984]: I0312 12:24:36.939862 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcb5s\" (UniqueName: \"kubernetes.io/projected/720101f1-0833-45af-a5b7-4910ece2a589-kube-api-access-dcb5s\") pod \"apiserver-56c75bf4c7-5rbw4\" (UID: \"720101f1-0833-45af-a5b7-4910ece2a589\") " pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:24:36.954994 master-0 kubenswrapper[13984]: I0312 12:24:36.954920 13984 scope.go:117] "RemoveContainer" containerID="66efbdc582547fe0aac5943aa60889d7bd9e3cd56c005e2ffa377552f8953df8"
Mar 12 12:24:36.960694 master-0 kubenswrapper[13984]: I0312 12:24:36.960654 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s28gq\" (UniqueName: \"kubernetes.io/projected/f19c3c89-8d32-4394-bd86-e5ef7734c42b-kube-api-access-s28gq\") pod \"cluster-samples-operator-664cb58b85-k9t2m\" (UID: \"f19c3c89-8d32-4394-bd86-e5ef7734c42b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-k9t2m"
Mar 12 12:24:36.969575 master-0 kubenswrapper[13984]: I0312 12:24:36.969459 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmlzw\" (UniqueName: \"kubernetes.io/projected/9bc7dea3-1868-488c-a34b-288cde3acd35-kube-api-access-xmlzw\") pod \"olm-operator-d64cfc9db-sp7w9\" (UID: \"9bc7dea3-1868-488c-a34b-288cde3acd35\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9"
Mar 12 12:24:36.974903 master-0 kubenswrapper[13984]: I0312 12:24:36.974849 13984 request.go:700] Waited for 2.885789108s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ovn-kubernetes/serviceaccounts/ovn-kubernetes-node/token
Mar 12 12:24:36.989033 master-0 kubenswrapper[13984]: I0312 12:24:36.988937 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd645\" (UniqueName: \"kubernetes.io/projected/f04121eb-5c7b-42cd-a2e2-26cf1c67593d-kube-api-access-gd645\") pod \"ovnkube-node-l5d2w\" (UID: \"f04121eb-5c7b-42cd-a2e2-26cf1c67593d\") " pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w"
Mar 12 12:24:37.020398 master-0 kubenswrapper[13984]: I0312 12:24:37.020257 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zct7x\" (UniqueName: \"kubernetes.io/projected/f3b704e7-1291-4645-8a0d-2a937829d7ac-kube-api-access-zct7x\") pod \"cluster-storage-operator-6fbfc8dc8f-5q4fw\" (UID: \"f3b704e7-1291-4645-8a0d-2a937829d7ac\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw"
Mar 12 12:24:37.043533 master-0 kubenswrapper[13984]: I0312 12:24:37.041645 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6flvz\" (UniqueName: \"kubernetes.io/projected/f3f295ac-7bc7-43b7-bd30-db82e7f16cd7-kube-api-access-6flvz\") pod \"dns-operator-589895fbb7-l8x6p\" (UID: \"f3f295ac-7bc7-43b7-bd30-db82e7f16cd7\") " pod="openshift-dns-operator/dns-operator-589895fbb7-l8x6p"
Mar 12 12:24:37.053408 master-0 kubenswrapper[13984]: I0312 12:24:37.053303 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdfnh\" (UniqueName: \"kubernetes.io/projected/c62edaec-38e2-4b73-8bb5-c776abfb310f-kube-api-access-cdfnh\") pod \"control-plane-machine-set-operator-6686554ddc-dnxx4\" (UID: \"c62edaec-38e2-4b73-8bb5-c776abfb310f\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4"
Mar 12 12:24:37.069554 master-0 kubenswrapper[13984]: I0312 12:24:37.069425 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd4jz\" (UniqueName: \"kubernetes.io/projected/02d5a507-4409-44b4-98bc-1751cdcc6c6a-kube-api-access-jd4jz\") pod \"cloud-credential-operator-55d85b7b47-gz7ll\" (UID: \"02d5a507-4409-44b4-98bc-1751cdcc6c6a\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-gz7ll"
Mar 12 12:24:37.100716 master-0 kubenswrapper[13984]: I0312 12:24:37.100657 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxt29\" (UniqueName: \"kubernetes.io/projected/269d77d9-815e-4324-8827-1ce429063ed1-kube-api-access-jxt29\") pod \"network-check-target-dfz7x\" (UID: \"269d77d9-815e-4324-8827-1ce429063ed1\") " pod="openshift-network-diagnostics/network-check-target-dfz7x"
Mar 12 12:24:37.119662 master-0 kubenswrapper[13984]: I0312 12:24:37.119582 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ft5sd\" (UniqueName: \"kubernetes.io/projected/a22189f2-3f35-4ea6-9892-39a1b46637e2-kube-api-access-ft5sd\") pod \"ingress-operator-677db989d6-vpss8\" (UID: \"a22189f2-3f35-4ea6-9892-39a1b46637e2\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-vpss8"
Mar 12 12:24:37.127680 master-0 kubenswrapper[13984]: I0312 12:24:37.127626 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgxbv\" (UniqueName: \"kubernetes.io/projected/c873b656-d2aa-4d0e-aa22-9f8d35186473-kube-api-access-kgxbv\") pod \"apiserver-7849849f76-86f2r\" (UID: \"c873b656-d2aa-4d0e-aa22-9f8d35186473\") " pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:37.149292 master-0 kubenswrapper[13984]: I0312 12:24:37.149239 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svpvs\" (UniqueName: \"kubernetes.io/projected/55bf535c-93ab-4870-a9d2-c02496d71ef0-kube-api-access-svpvs\") pod \"service-ca-operator-69b6fc6b88-s2gsp\" (UID: \"55bf535c-93ab-4870-a9d2-c02496d71ef0\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-s2gsp"
Mar 12 12:24:37.169864 master-0 kubenswrapper[13984]: I0312 12:24:37.169814 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb249\" (UniqueName: \"kubernetes.io/projected/6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc-kube-api-access-tb249\") pod \"machine-approver-754bdc9f9d-2hk7d\" (UID: \"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d"
Mar 12 12:24:37.188909 master-0 kubenswrapper[13984]: I0312 12:24:37.188866 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6571f5e5-07ee-4e6c-a8ad-277bc52e35ee-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-d4htx\" (UID: \"6571f5e5-07ee-4e6c-a8ad-277bc52e35ee\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-d4htx"
Mar 12 12:24:37.211403 master-0 kubenswrapper[13984]: E0312 12:24:37.211309 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.211403 master-0 kubenswrapper[13984]: E0312 12:24:37.211360 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.211764 master-0 kubenswrapper[13984]: E0312 12:24:37.211453 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:24:37.711425727 +0000 UTC m=+9.909441239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.227598 master-0 kubenswrapper[13984]: E0312 12:24:37.227530 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.227598 master-0 kubenswrapper[13984]: E0312 12:24:37.227574 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.227809 master-0 kubenswrapper[13984]: E0312 12:24:37.227646 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:37.727622131 +0000 UTC m=+9.925637613 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.242460 master-0 kubenswrapper[13984]: I0312 12:24:37.242421 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/2.log"
Mar 12 12:24:37.243264 master-0 kubenswrapper[13984]: I0312 12:24:37.243222 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/1.log"
Mar 12 12:24:37.243851 master-0 kubenswrapper[13984]: I0312 12:24:37.243818 13984 generic.go:334] "Generic (PLEG): container finished" podID="632651f7-6641-49d8-9c48-7f6ea5846538" containerID="da81346b3001ee955f12da9dabc7c3b2591cdb7d6466508682550d487c8ccccc" exitCode=255
Mar 12 12:24:37.245787 master-0 kubenswrapper[13984]: I0312 12:24:37.245758 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/2.log"
Mar 12 12:24:37.246201 master-0 kubenswrapper[13984]: I0312 12:24:37.246179 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/1.log"
Mar 12 12:24:37.246528 master-0 kubenswrapper[13984]: I0312 12:24:37.246506 13984 generic.go:334] "Generic (PLEG): container finished" podID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" containerID="c7a8cac38562711df5097b3495b3fcbf36ebc31ee34e5cfed0a5c587cefecb36" exitCode=1
Mar 12 12:24:37.247622 master-0 kubenswrapper[13984]: E0312 12:24:37.247579 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 12 12:24:37.262313 master-0 kubenswrapper[13984]: E0312 12:24:37.262246 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:24:37.322880 master-0 kubenswrapper[13984]: E0312 12:24:37.322763 13984 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="1.273s"
Mar 12 12:24:37.322880 master-0 kubenswrapper[13984]: I0312 12:24:37.322826 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 12 12:24:37.322880 master-0 kubenswrapper[13984]: I0312 12:24:37.322869 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 12 12:24:37.323128 master-0 kubenswrapper[13984]: I0312 12:24:37.322887 13984 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="62ebf93a-f9a1-4681-b086-18be545a70fd"
Mar 12 12:24:37.323128 master-0 kubenswrapper[13984]: I0312 12:24:37.322913 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" event={"ID":"632651f7-6641-49d8-9c48-7f6ea5846538","Type":"ContainerDied","Data":"da81346b3001ee955f12da9dabc7c3b2591cdb7d6466508682550d487c8ccccc"}
Mar 12 12:24:37.323128 master-0 kubenswrapper[13984]: I0312 12:24:37.322986 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:24:37.323128 master-0 kubenswrapper[13984]: I0312 12:24:37.323007 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerDied","Data":"c7a8cac38562711df5097b3495b3fcbf36ebc31ee34e5cfed0a5c587cefecb36"}
Mar 12 12:24:37.323280 master-0 kubenswrapper[13984]: I0312 12:24:37.323138 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 12 12:24:37.323280 master-0 kubenswrapper[13984]: I0312 12:24:37.323161 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 12 12:24:37.323280 master-0 kubenswrapper[13984]: I0312 12:24:37.323222 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 12 12:24:37.323280 master-0 kubenswrapper[13984]: I0312 12:24:37.323239 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-4vzl8" event={"ID":"68c57a64-f30c-4caf-89ef-08bd0d36833e","Type":"ContainerStarted","Data":"07271945950d130b6b623e1737e6ff19545756721c6cf97080d6b786f7773cb7"}
Mar 12 12:24:37.323280 master-0 kubenswrapper[13984]: I0312 12:24:37.323276 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg"
Mar 12 12:24:37.323446 master-0 kubenswrapper[13984]: I0312 12:24:37.323347 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:24:37.323446 master-0 kubenswrapper[13984]: I0312 12:24:37.323419 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:37.323535 master-0 kubenswrapper[13984]: I0312 12:24:37.323460 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:24:37.323574 master-0 kubenswrapper[13984]: I0312 12:24:37.323545 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7"
Mar 12 12:24:37.323619 master-0 kubenswrapper[13984]: I0312 12:24:37.323583 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:24:37.323619 master-0 kubenswrapper[13984]: I0312 12:24:37.323612 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4"
Mar 12 12:24:37.323687 master-0 kubenswrapper[13984]: I0312 12:24:37.323645 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-k8t84"
Mar 12 12:24:37.323687 master-0 kubenswrapper[13984]: I0312 12:24:37.323678 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-k8t84"
Mar 12 12:24:37.323765 master-0 kubenswrapper[13984]: I0312 12:24:37.323712 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:37.323765 master-0 kubenswrapper[13984]: I0312 12:24:37.323742 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-7849849f76-86f2r"
Mar 12 12:24:37.324957 master-0 kubenswrapper[13984]: I0312 12:24:37.324930 13984 scope.go:117] "RemoveContainer" containerID="da81346b3001ee955f12da9dabc7c3b2591cdb7d6466508682550d487c8ccccc"
Mar 12 12:24:37.325186 master-0 kubenswrapper[13984]: E0312 12:24:37.325157 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-69576476f7-ph7gk_openshift-machine-api(632651f7-6641-49d8-9c48-7f6ea5846538)\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" podUID="632651f7-6641-49d8-9c48-7f6ea5846538"
Mar 12 12:24:37.325421 master-0 kubenswrapper[13984]: I0312 12:24:37.325397 13984 scope.go:117] "RemoveContainer" containerID="c119461954016ba56cd4650cbf6e8e2b03da1364b0a52ff3ba4437048b8fac29"
Mar 12 12:24:37.326416 master-0 kubenswrapper[13984]: I0312 12:24:37.326391 13984 scope.go:117] "RemoveContainer" containerID="c7a8cac38562711df5097b3495b3fcbf36ebc31ee34e5cfed0a5c587cefecb36"
Mar 12 12:24:37.326611 master-0 kubenswrapper[13984]: E0312 12:24:37.326584 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe"
Mar 12 12:24:37.342496 master-0 kubenswrapper[13984]: I0312 12:24:37.342436 13984 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 12 12:24:37.342846 master-0 kubenswrapper[13984]: I0312 12:24:37.342568 13984 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 12 12:24:37.356147 master-0 kubenswrapper[13984]: I0312 12:24:37.356112 13984 scope.go:117] "RemoveContainer" containerID="06f4c15730d5d23bfb91ec1bbc7bab14f9a3a3ae32c22b935d487a1f88576da3"
Mar 12 12:24:37.533332 master-0 kubenswrapper[13984]: I0312 12:24:37.533230 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=18.533198037 podStartE2EDuration="18.533198037s" podCreationTimestamp="2026-03-12 12:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:24:37.531971509 +0000 UTC m=+9.729987001" watchObservedRunningTime="2026-03-12 12:24:37.533198037 +0000 UTC m=+9.731213559"
Mar 12 12:24:37.568830 master-0 kubenswrapper[13984]: I0312 12:24:37.568736 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=18.568715049 podStartE2EDuration="18.568715049s" podCreationTimestamp="2026-03-12 12:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:24:37.568634006 +0000 UTC m=+9.766649538" watchObservedRunningTime="2026-03-12 12:24:37.568715049 +0000 UTC m=+9.766730551"
Mar 12 12:24:37.765317 master-0 kubenswrapper[13984]: I0312 12:24:37.765257 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:24:37.765317 master-0 kubenswrapper[13984]: I0312 12:24:37.765319 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:24:37.765554 master-0 kubenswrapper[13984]: E0312 12:24:37.765528 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.765554 master-0 kubenswrapper[13984]: E0312 12:24:37.765553 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.765643 master-0 kubenswrapper[13984]: E0312 12:24:37.765604 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:38.765588525 +0000 UTC m=+10.963604027 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.766067 master-0 kubenswrapper[13984]: E0312 12:24:37.766033 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.766067 master-0 kubenswrapper[13984]: E0312 12:24:37.766059 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:24:37.766150 master-0 kubenswrapper[13984]: E0312 12:24:37.766090 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:24:38.766080094 +0000 UTC m=+10.964095606 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:38.198633 master-0 kubenswrapper[13984]: I0312 12:24:38.198578 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:38.258120 master-0 kubenswrapper[13984]: I0312 12:24:38.258067 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/2.log" Mar 12 12:24:38.260230 master-0 kubenswrapper[13984]: I0312 12:24:38.260204 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/2.log" Mar 12 12:24:38.260834 master-0 kubenswrapper[13984]: I0312 12:24:38.260807 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:38.780530 master-0 kubenswrapper[13984]: I0312 12:24:38.780439 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:38.780530 master-0 kubenswrapper[13984]: I0312 12:24:38.780538 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " 
pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:38.781084 master-0 kubenswrapper[13984]: E0312 12:24:38.780696 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:38.781084 master-0 kubenswrapper[13984]: E0312 12:24:38.780741 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:38.781084 master-0 kubenswrapper[13984]: E0312 12:24:38.780782 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:38.781084 master-0 kubenswrapper[13984]: E0312 12:24:38.780810 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:38.781084 master-0 kubenswrapper[13984]: E0312 12:24:38.780815 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:24:40.780791793 +0000 UTC m=+12.978807295 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:38.781084 master-0 kubenswrapper[13984]: E0312 12:24:38.780990 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:40.780919608 +0000 UTC m=+12.978935160 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:39.266347 master-0 kubenswrapper[13984]: I0312 12:24:39.265963 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:40.090169 master-0 kubenswrapper[13984]: I0312 12:24:40.090089 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:40.090513 master-0 kubenswrapper[13984]: I0312 12:24:40.090297 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:40.433988 master-0 kubenswrapper[13984]: I0312 12:24:40.433857 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:40.434566 master-0 kubenswrapper[13984]: I0312 12:24:40.434065 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:40.440198 master-0 kubenswrapper[13984]: I0312 12:24:40.440133 13984 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:40.484093 master-0 kubenswrapper[13984]: I0312 12:24:40.484007 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:24:40.488547 master-0 kubenswrapper[13984]: I0312 12:24:40.486832 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-dfz7x" Mar 12 12:24:40.805097 master-0 kubenswrapper[13984]: I0312 12:24:40.804980 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:40.807079 master-0 kubenswrapper[13984]: I0312 12:24:40.806998 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:40.807250 master-0 kubenswrapper[13984]: I0312 12:24:40.807095 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:40.807386 master-0 kubenswrapper[13984]: I0312 12:24:40.807334 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:40.807539 master-0 kubenswrapper[13984]: E0312 12:24:40.807423 13984 projected.go:288] Couldn't get configMap 
openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:40.807539 master-0 kubenswrapper[13984]: E0312 12:24:40.807461 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:40.807751 master-0 kubenswrapper[13984]: E0312 12:24:40.807581 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:44.807550134 +0000 UTC m=+17.005565666 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:40.808465 master-0 kubenswrapper[13984]: E0312 12:24:40.808408 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:40.808465 master-0 kubenswrapper[13984]: E0312 12:24:40.808457 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:40.808796 master-0 kubenswrapper[13984]: E0312 12:24:40.808560 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. 
No retries permitted until 2026-03-12 12:24:44.808534693 +0000 UTC m=+17.006550225 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:40.810040 master-0 kubenswrapper[13984]: I0312 12:24:40.809762 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:24:40.813184 master-0 kubenswrapper[13984]: I0312 12:24:40.813041 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:24:41.119154 master-0 kubenswrapper[13984]: I0312 12:24:41.119005 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:24:41.130051 master-0 kubenswrapper[13984]: I0312 12:24:41.129973 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-nwk7v" Mar 12 12:24:41.144121 master-0 kubenswrapper[13984]: I0312 12:24:41.144025 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:41.183591 master-0 kubenswrapper[13984]: I0312 12:24:41.183521 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:41.277911 master-0 kubenswrapper[13984]: I0312 12:24:41.277812 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:41.277911 master-0 kubenswrapper[13984]: I0312 12:24:41.277870 13984 prober_manager.go:312] "Failed to 
trigger a manual run" probe="Readiness" Mar 12 12:24:41.278237 master-0 kubenswrapper[13984]: I0312 12:24:41.277929 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:41.284537 master-0 kubenswrapper[13984]: I0312 12:24:41.283824 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:41.331322 master-0 kubenswrapper[13984]: I0312 12:24:41.331268 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:24:41.337544 master-0 kubenswrapper[13984]: I0312 12:24:41.337084 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:24:41.592972 master-0 kubenswrapper[13984]: I0312 12:24:41.592927 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:41.593791 master-0 kubenswrapper[13984]: I0312 12:24:41.593772 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:41.609242 master-0 kubenswrapper[13984]: I0312 12:24:41.609201 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:24:41.956418 master-0 kubenswrapper[13984]: I0312 12:24:41.956357 13984 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 12 12:24:41.956725 master-0 kubenswrapper[13984]: I0312 12:24:41.956670 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" 
containerID="cri-o://faf97dd3763176aab30d820af53bb9c984317f76cbc30d430d1b3c10226441f2" gracePeriod=5 Mar 12 12:24:41.973824 master-0 kubenswrapper[13984]: I0312 12:24:41.973771 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-56c75bf4c7-5rbw4" Mar 12 12:24:42.261754 master-0 kubenswrapper[13984]: I0312 12:24:42.261628 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-7849849f76-86f2r" Mar 12 12:24:42.282580 master-0 kubenswrapper[13984]: I0312 12:24:42.282517 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:42.611395 master-0 kubenswrapper[13984]: I0312 12:24:42.611270 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:42.618151 master-0 kubenswrapper[13984]: I0312 12:24:42.618106 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:24:42.877916 master-0 kubenswrapper[13984]: I0312 12:24:42.877758 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:42.878170 master-0 kubenswrapper[13984]: I0312 12:24:42.878002 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:42.878170 master-0 kubenswrapper[13984]: I0312 12:24:42.878021 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:24:42.908076 master-0 kubenswrapper[13984]: I0312 12:24:42.907981 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:24:43.289599 master-0 kubenswrapper[13984]: I0312 12:24:43.289547 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 
12:24:44.140679 master-0 kubenswrapper[13984]: I0312 12:24:44.140607 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:24:44.145585 master-0 kubenswrapper[13984]: I0312 12:24:44.145465 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:24:44.359033 master-0 kubenswrapper[13984]: I0312 12:24:44.358970 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:24:44.371914 master-0 kubenswrapper[13984]: I0312 12:24:44.371846 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-sp7w9" Mar 12 12:24:44.740741 master-0 kubenswrapper[13984]: I0312 12:24:44.739636 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:24:44.743251 master-0 kubenswrapper[13984]: I0312 12:24:44.743181 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:24:44.872365 master-0 kubenswrapper[13984]: I0312 12:24:44.872315 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:44.872570 master-0 kubenswrapper[13984]: I0312 12:24:44.872534 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:44.872633 master-0 kubenswrapper[13984]: E0312 12:24:44.872470 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:44.872633 master-0 kubenswrapper[13984]: E0312 12:24:44.872592 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:44.872729 master-0 kubenswrapper[13984]: E0312 12:24:44.872642 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:24:52.872625845 +0000 UTC m=+25.070641347 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:44.872786 master-0 kubenswrapper[13984]: E0312 12:24:44.872743 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:44.872786 master-0 kubenswrapper[13984]: E0312 12:24:44.872761 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:44.872869 master-0 kubenswrapper[13984]: E0312 12:24:44.872803 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:52.872791803 +0000 UTC m=+25.070807305 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:46.978969 master-0 kubenswrapper[13984]: I0312 12:24:46.978926 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-98xjv_a346ac54-02fe-417f-a49d-038e45b13a1d/authentication-operator/0.log" Mar 12 12:24:47.212298 master-0 kubenswrapper[13984]: I0312 12:24:47.212163 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-98xjv_a346ac54-02fe-417f-a49d-038e45b13a1d/authentication-operator/1.log" Mar 12 12:24:47.310407 master-0 kubenswrapper[13984]: I0312 12:24:47.310283 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 12 12:24:47.310407 master-0 kubenswrapper[13984]: I0312 12:24:47.310334 13984 generic.go:334] "Generic (PLEG): container finished" podID="f417e14665db2ffffa887ce21c9ff0ed" containerID="faf97dd3763176aab30d820af53bb9c984317f76cbc30d430d1b3c10226441f2" exitCode=137 Mar 12 12:24:47.797905 master-0 kubenswrapper[13984]: I0312 12:24:47.797858 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 12 12:24:47.798116 master-0 kubenswrapper[13984]: I0312 12:24:47.797948 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:47.907397 master-0 kubenswrapper[13984]: I0312 12:24:47.907343 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 12 12:24:47.907742 master-0 kubenswrapper[13984]: I0312 12:24:47.907666 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:47.907813 master-0 kubenswrapper[13984]: I0312 12:24:47.907716 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 12 12:24:47.907880 master-0 kubenswrapper[13984]: I0312 12:24:47.907849 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 12 12:24:47.907951 master-0 kubenswrapper[13984]: I0312 12:24:47.907930 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 12 12:24:47.908043 master-0 kubenswrapper[13984]: I0312 12:24:47.908010 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " Mar 12 12:24:47.908099 master-0 kubenswrapper[13984]: I0312 12:24:47.908034 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests" (OuterVolumeSpecName: "manifests") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:47.908156 master-0 kubenswrapper[13984]: I0312 12:24:47.908102 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log" (OuterVolumeSpecName: "var-log") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:47.908242 master-0 kubenswrapper[13984]: I0312 12:24:47.908219 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:47.908647 master-0 kubenswrapper[13984]: I0312 12:24:47.908612 13984 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:47.908647 master-0 kubenswrapper[13984]: I0312 12:24:47.908643 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:47.908761 master-0 kubenswrapper[13984]: I0312 12:24:47.908656 13984 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:47.908761 master-0 kubenswrapper[13984]: I0312 12:24:47.908669 13984 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:47.910123 master-0 kubenswrapper[13984]: I0312 12:24:47.910087 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-56c75bf4c7-5rbw4_720101f1-0833-45af-a5b7-4910ece2a589/fix-audit-permissions/0.log" Mar 12 12:24:47.913919 master-0 kubenswrapper[13984]: I0312 12:24:47.913840 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:24:47.986931 master-0 kubenswrapper[13984]: I0312 12:24:47.986880 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f417e14665db2ffffa887ce21c9ff0ed" path="/var/lib/kubelet/pods/f417e14665db2ffffa887ce21c9ff0ed/volumes" Mar 12 12:24:47.987414 master-0 kubenswrapper[13984]: I0312 12:24:47.987121 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 12 12:24:48.009831 master-0 kubenswrapper[13984]: I0312 12:24:48.009694 13984 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:24:48.317455 master-0 kubenswrapper[13984]: I0312 12:24:48.317324 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log" Mar 12 12:24:48.317455 master-0 kubenswrapper[13984]: I0312 12:24:48.317430 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:24:49.148192 master-0 kubenswrapper[13984]: I0312 12:24:49.148070 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-56c75bf4c7-5rbw4_720101f1-0833-45af-a5b7-4910ece2a589/oauth-apiserver/0.log" Mar 12 12:24:50.989087 master-0 kubenswrapper[13984]: E0312 12:24:50.989008 13984 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="3.01s" Mar 12 12:24:50.989601 master-0 kubenswrapper[13984]: I0312 12:24:50.989228 13984 scope.go:117] "RemoveContainer" containerID="faf97dd3763176aab30d820af53bb9c984317f76cbc30d430d1b3c10226441f2" Mar 12 12:24:51.001800 master-0 kubenswrapper[13984]: I0312 12:24:51.001755 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 12 12:24:51.237737 master-0 kubenswrapper[13984]: I0312 12:24:51.237617 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 12 12:24:51.237737 master-0 kubenswrapper[13984]: I0312 12:24:51.237691 13984 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="21d576a5-cf82-46ee-868b-d297ac37cf1e" Mar 12 12:24:51.237737 master-0 kubenswrapper[13984]: I0312 12:24:51.237733 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 12 12:24:51.238227 master-0 kubenswrapper[13984]: I0312 12:24:51.237756 13984 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="21d576a5-cf82-46ee-868b-d297ac37cf1e" Mar 12 12:24:51.239597 master-0 kubenswrapper[13984]: I0312 12:24:51.239366 13984 
scope.go:117] "RemoveContainer" containerID="da81346b3001ee955f12da9dabc7c3b2591cdb7d6466508682550d487c8ccccc" Mar 12 12:24:51.243583 master-0 kubenswrapper[13984]: I0312 12:24:51.243467 13984 scope.go:117] "RemoveContainer" containerID="c7a8cac38562711df5097b3495b3fcbf36ebc31ee34e5cfed0a5c587cefecb36" Mar 12 12:24:51.245293 master-0 kubenswrapper[13984]: I0312 12:24:51.245244 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-7nb6b_ab087440-bdf2-4e2f-9a5a-434d50a2329a/etcd-operator/0.log" Mar 12 12:24:51.306534 master-0 kubenswrapper[13984]: I0312 12:24:51.303014 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-7nb6b_ab087440-bdf2-4e2f-9a5a-434d50a2329a/etcd-operator/1.log" Mar 12 12:24:51.320510 master-0 kubenswrapper[13984]: I0312 12:24:51.318556 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/setup/0.log" Mar 12 12:24:51.384400 master-0 kubenswrapper[13984]: I0312 12:24:51.384226 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-ensure-env-vars/0.log" Mar 12 12:24:51.407223 master-0 kubenswrapper[13984]: I0312 12:24:51.406176 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-resources-copy/0.log" Mar 12 12:24:51.421577 master-0 kubenswrapper[13984]: I0312 12:24:51.419011 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 12 12:24:51.449957 master-0 kubenswrapper[13984]: I0312 12:24:51.449907 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 12 12:24:51.461367 master-0 kubenswrapper[13984]: I0312 12:24:51.461335 
13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 12 12:24:51.467378 master-0 kubenswrapper[13984]: I0312 12:24:51.467353 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-readyz/0.log" Mar 12 12:24:51.476665 master-0 kubenswrapper[13984]: I0312 12:24:51.476628 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 12 12:24:51.484685 master-0 kubenswrapper[13984]: I0312 12:24:51.484650 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1/installer/0.log" Mar 12 12:24:51.492930 master-0 kubenswrapper[13984]: I0312 12:24:51.492839 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-mpxz4_8a121d0d-d201-446b-97a1-e2414e599f4a/kube-apiserver-operator/0.log" Mar 12 12:24:51.651845 master-0 kubenswrapper[13984]: I0312 12:24:51.651688 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-mpxz4_8a121d0d-d201-446b-97a1-e2414e599f4a/kube-apiserver-operator/1.log" Mar 12 12:24:51.857065 master-0 kubenswrapper[13984]: I0312 12:24:51.856936 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_48e7be9a-921a-42b0-b9ae-b7ffd28c89a4/installer/0.log" Mar 12 12:24:52.052012 master-0 kubenswrapper[13984]: I0312 12:24:52.051953 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/setup/0.log" Mar 12 12:24:52.261206 master-0 kubenswrapper[13984]: I0312 12:24:52.261122 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver/0.log" Mar 12 12:24:52.344340 master-0 kubenswrapper[13984]: I0312 12:24:52.344249 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/2.log" Mar 12 12:24:52.345054 master-0 kubenswrapper[13984]: I0312 12:24:52.344995 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerStarted","Data":"45a3f6cede6c52f3c09a792270927a52a9ce9fba7d97a6e7da8f8579cfb8d0ad"} Mar 12 12:24:52.348786 master-0 kubenswrapper[13984]: I0312 12:24:52.348473 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/2.log" Mar 12 12:24:52.349530 master-0 kubenswrapper[13984]: I0312 12:24:52.349332 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" event={"ID":"632651f7-6641-49d8-9c48-7f6ea5846538","Type":"ContainerStarted","Data":"f0fb0da09e8c36b2b08b8d5e0695af6a8eeabc526c30a2a41b1610b3f1603e7d"} Mar 12 12:24:52.457770 master-0 kubenswrapper[13984]: I0312 12:24:52.457708 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 12 12:24:52.659056 master-0 kubenswrapper[13984]: I0312 12:24:52.658875 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-regeneration-controller/0.log" Mar 12 12:24:52.853269 master-0 kubenswrapper[13984]: I0312 
12:24:52.853179 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-insecure-readyz/0.log" Mar 12 12:24:52.973441 master-0 kubenswrapper[13984]: I0312 12:24:52.973331 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:24:52.973885 master-0 kubenswrapper[13984]: E0312 12:24:52.973809 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:52.973885 master-0 kubenswrapper[13984]: E0312 12:24:52.973887 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:52.974219 master-0 kubenswrapper[13984]: I0312 12:24:52.974177 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:24:52.974522 master-0 kubenswrapper[13984]: E0312 12:24:52.974326 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:25:08.974291424 +0000 UTC m=+41.172306936 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:24:52.974684 master-0 kubenswrapper[13984]: E0312 12:24:52.974411 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:52.974684 master-0 kubenswrapper[13984]: E0312 12:24:52.974623 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:52.974936 master-0 kubenswrapper[13984]: E0312 12:24:52.974838 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:25:08.974742994 +0000 UTC m=+41.172758526 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:24:53.056445 master-0 kubenswrapper[13984]: I0312 12:24:53.056401 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-check-endpoints/0.log" Mar 12 12:24:53.460369 master-0 kubenswrapper[13984]: I0312 12:24:53.460279 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_a97fcd56-aa52-414a-b370-154c1b34c1ed/installer/0.log" Mar 12 12:24:53.659356 master-0 kubenswrapper[13984]: I0312 12:24:53.659279 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:24:53.861249 master-0 kubenswrapper[13984]: I0312 12:24:53.861112 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/0.log" Mar 12 12:24:54.054917 master-0 kubenswrapper[13984]: I0312 12:24:54.054821 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager-cert-syncer/0.log" Mar 12 12:24:54.254132 master-0 kubenswrapper[13984]: I0312 12:24:54.254071 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager-recovery-controller/0.log" Mar 12 12:24:54.459749 master-0 kubenswrapper[13984]: I0312 
12:24:54.459672 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-d4htx_6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/kube-controller-manager-operator/0.log" Mar 12 12:24:54.652446 master-0 kubenswrapper[13984]: I0312 12:24:54.652247 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-d4htx_6571f5e5-07ee-4e6c-a8ad-277bc52e35ee/kube-controller-manager-operator/1.log" Mar 12 12:24:54.882005 master-0 kubenswrapper[13984]: I0312 12:24:54.881920 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_a1a56802af72ce1aac6b5077f1695ac0/kube-scheduler/1.log" Mar 12 12:24:55.056000 master-0 kubenswrapper[13984]: I0312 12:24:55.055903 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_bootstrap-kube-scheduler-master-0_a1a56802af72ce1aac6b5077f1695ac0/kube-scheduler/2.log" Mar 12 12:24:55.260510 master-0 kubenswrapper[13984]: I0312 12:24:55.260400 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_b7aa62dd-2de4-4511-a7e7-27f45fe97cc1/installer/0.log" Mar 12 12:24:55.455254 master-0 kubenswrapper[13984]: I0312 12:24:55.455184 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-2kkmf_9b960fe2-d59e-4ee1-bd9d-455b46753cb9/kube-scheduler-operator-container/0.log" Mar 12 12:24:55.655183 master-0 kubenswrapper[13984]: I0312 12:24:55.655093 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler-operator_openshift-kube-scheduler-operator-5c74bfc494-2kkmf_9b960fe2-d59e-4ee1-bd9d-455b46753cb9/kube-scheduler-operator-container/1.log" Mar 12 12:24:55.870357 master-0 kubenswrapper[13984]: I0312 12:24:55.870261 13984 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-gc2gv_ea80247e-b4dd-45dc-8255-6e68508c8480/openshift-apiserver-operator/0.log" Mar 12 12:24:56.164283 master-0 kubenswrapper[13984]: I0312 12:24:56.164185 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver-operator_openshift-apiserver-operator-799b6db4d7-gc2gv_ea80247e-b4dd-45dc-8255-6e68508c8480/openshift-apiserver-operator/1.log" Mar 12 12:24:56.230394 master-0 kubenswrapper[13984]: I0312 12:24:56.230350 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-rnnjn"] Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: E0312 12:24:56.230536 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: I0312 12:24:56.230548 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: E0312 12:24:56.230561 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a59a6bb7-f966-4208-ba85-452095404891" containerName="installer" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: I0312 12:24:56.230574 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a59a6bb7-f966-4208-ba85-452095404891" containerName="installer" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: E0312 12:24:56.230585 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerName="installer" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: I0312 12:24:56.230591 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerName="installer" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: E0312 12:24:56.230603 13984 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerName="installer" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: I0312 12:24:56.230608 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerName="installer" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: E0312 12:24:56.230615 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" containerName="installer" Mar 12 12:24:56.230619 master-0 kubenswrapper[13984]: I0312 12:24:56.230621 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: E0312 12:24:56.230632 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230638 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: E0312 12:24:56.230646 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230652 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: E0312 12:24:56.230663 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230669 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" 
containerName="assisted-installer-controller" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: E0312 12:24:56.230678 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230683 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230759 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7c1a86e-0ad7-4978-80ae-163dbc44fafb" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230773 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7aa62dd-2de4-4511-a7e7-27f45fe97cc1" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230790 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97fcd56-aa52-414a-b370-154c1b34c1ed" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230796 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a59a6bb7-f966-4208-ba85-452095404891" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230805 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="33be2f5b-c837-4a07-8ad9-4400a36f53c1" containerName="assisted-installer-controller" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230813 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230827 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cce66fc4-350d-4a86-acb2-d8d672cf2491" containerName="prober" Mar 12 12:24:56.230887 master-0 
kubenswrapper[13984]: I0312 12:24:56.230834 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1" containerName="installer" Mar 12 12:24:56.230887 master-0 kubenswrapper[13984]: I0312 12:24:56.230842 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" containerName="installer" Mar 12 12:24:56.231457 master-0 kubenswrapper[13984]: I0312 12:24:56.231199 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.234158 master-0 kubenswrapper[13984]: I0312 12:24:56.234117 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 12:24:56.234653 master-0 kubenswrapper[13984]: I0312 12:24:56.234617 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 12:24:56.234928 master-0 kubenswrapper[13984]: I0312 12:24:56.234899 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 12 12:24:56.235005 master-0 kubenswrapper[13984]: I0312 12:24:56.234962 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 12:24:56.235087 master-0 kubenswrapper[13984]: I0312 12:24:56.235057 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 12 12:24:56.237007 master-0 kubenswrapper[13984]: I0312 12:24:56.236967 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m"] Mar 12 12:24:56.237739 master-0 kubenswrapper[13984]: I0312 12:24:56.237715 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.240724 master-0 kubenswrapper[13984]: I0312 12:24:56.240688 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 12 12:24:56.240789 master-0 kubenswrapper[13984]: I0312 12:24:56.240764 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rvcb2" Mar 12 12:24:56.272291 master-0 kubenswrapper[13984]: I0312 12:24:56.272250 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-7849849f76-86f2r_c873b656-d2aa-4d0e-aa22-9f8d35186473/fix-audit-permissions/0.log" Mar 12 12:24:56.273935 master-0 kubenswrapper[13984]: I0312 12:24:56.273874 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-rnnjn"] Mar 12 12:24:56.292573 master-0 kubenswrapper[13984]: I0312 12:24:56.292531 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m"] Mar 12 12:24:56.420279 master-0 kubenswrapper[13984]: I0312 12:24:56.420144 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9713325e-711d-4255-a61b-4b760be05cde-apiservice-cert\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.420279 master-0 kubenswrapper[13984]: I0312 12:24:56.420197 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c424f946-e2fe-4450-816b-b79640269ff5-serving-cert\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: 
\"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.420279 master-0 kubenswrapper[13984]: I0312 12:24:56.420223 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.420279 master-0 kubenswrapper[13984]: I0312 12:24:56.420266 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-config\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.420712 master-0 kubenswrapper[13984]: I0312 12:24:56.420293 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9x2bx\" (UniqueName: \"kubernetes.io/projected/c424f946-e2fe-4450-816b-b79640269ff5-kube-api-access-9x2bx\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.420712 master-0 kubenswrapper[13984]: I0312 12:24:56.420376 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9713325e-711d-4255-a61b-4b760be05cde-webhook-cert\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.420712 master-0 kubenswrapper[13984]: I0312 12:24:56.420400 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9713325e-711d-4255-a61b-4b760be05cde-tmpfs\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.420712 master-0 kubenswrapper[13984]: I0312 12:24:56.420415 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmx64\" (UniqueName: \"kubernetes.io/projected/9713325e-711d-4255-a61b-4b760be05cde-kube-api-access-xmx64\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.461372 master-0 kubenswrapper[13984]: I0312 12:24:56.461288 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-7849849f76-86f2r_c873b656-d2aa-4d0e-aa22-9f8d35186473/openshift-apiserver/0.log" Mar 12 12:24:56.521360 master-0 kubenswrapper[13984]: I0312 12:24:56.521287 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9713325e-711d-4255-a61b-4b760be05cde-webhook-cert\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.521360 master-0 kubenswrapper[13984]: I0312 12:24:56.521350 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9713325e-711d-4255-a61b-4b760be05cde-tmpfs\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.521607 master-0 kubenswrapper[13984]: I0312 12:24:56.521580 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xmx64\" (UniqueName: \"kubernetes.io/projected/9713325e-711d-4255-a61b-4b760be05cde-kube-api-access-xmx64\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.521756 master-0 kubenswrapper[13984]: I0312 12:24:56.521724 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9713325e-711d-4255-a61b-4b760be05cde-apiservice-cert\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.521799 master-0 kubenswrapper[13984]: I0312 12:24:56.521774 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c424f946-e2fe-4450-816b-b79640269ff5-serving-cert\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.521839 master-0 kubenswrapper[13984]: I0312 12:24:56.521803 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.521887 master-0 kubenswrapper[13984]: I0312 12:24:56.521840 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-config\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " 
pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.521887 master-0 kubenswrapper[13984]: I0312 12:24:56.521848 13984 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 12 12:24:56.521887 master-0 kubenswrapper[13984]: I0312 12:24:56.521865 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x2bx\" (UniqueName: \"kubernetes.io/projected/c424f946-e2fe-4450-816b-b79640269ff5-kube-api-access-9x2bx\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.522126 master-0 kubenswrapper[13984]: E0312 12:24:56.522092 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:57.022067918 +0000 UTC m=+29.220083500 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:24:56.523514 master-0 kubenswrapper[13984]: I0312 12:24:56.522502 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/9713325e-711d-4255-a61b-4b760be05cde-tmpfs\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.523514 master-0 kubenswrapper[13984]: I0312 12:24:56.522862 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-config\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.524698 master-0 kubenswrapper[13984]: I0312 12:24:56.524657 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c424f946-e2fe-4450-816b-b79640269ff5-serving-cert\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:56.524913 master-0 kubenswrapper[13984]: I0312 12:24:56.524876 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9713325e-711d-4255-a61b-4b760be05cde-webhook-cert\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.525209 master-0 kubenswrapper[13984]: I0312 
12:24:56.525169 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9713325e-711d-4255-a61b-4b760be05cde-apiservice-cert\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.556202 master-0 kubenswrapper[13984]: E0312 12:24:56.556156 13984 projected.go:288] Couldn't get configMap openshift-console-operator/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 12 12:24:56.556202 master-0 kubenswrapper[13984]: E0312 12:24:56.556204 13984 projected.go:194] Error preparing data for projected volume kube-api-access-9x2bx for pod openshift-console-operator/console-operator-6c7fb6b958-rnnjn: configmap "kube-root-ca.crt" not found Mar 12 12:24:56.556396 master-0 kubenswrapper[13984]: E0312 12:24:56.556270 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c424f946-e2fe-4450-816b-b79640269ff5-kube-api-access-9x2bx podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:57.056245101 +0000 UTC m=+29.254260593 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-9x2bx" (UniqueName: "kubernetes.io/projected/c424f946-e2fe-4450-816b-b79640269ff5-kube-api-access-9x2bx") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap "kube-root-ca.crt" not found Mar 12 12:24:56.558635 master-0 kubenswrapper[13984]: I0312 12:24:56.558586 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmx64\" (UniqueName: \"kubernetes.io/projected/9713325e-711d-4255-a61b-4b760be05cde-kube-api-access-xmx64\") pod \"packageserver-9df8858fc-8d24m\" (UID: \"9713325e-711d-4255-a61b-4b760be05cde\") " pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.569814 master-0 kubenswrapper[13984]: I0312 12:24:56.569772 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:56.670572 master-0 kubenswrapper[13984]: I0312 12:24:56.670130 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-apiserver_apiserver-7849849f76-86f2r_c873b656-d2aa-4d0e-aa22-9f8d35186473/openshift-apiserver-check-endpoints/0.log" Mar 12 12:24:56.997284 master-0 kubenswrapper[13984]: I0312 12:24:56.997213 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-7nb6b_ab087440-bdf2-4e2f-9a5a-434d50a2329a/etcd-operator/0.log" Mar 12 12:24:57.026377 master-0 kubenswrapper[13984]: I0312 12:24:57.026270 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:57.026815 master-0 kubenswrapper[13984]: E0312 12:24:57.026543 13984 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:24:58.026514945 +0000 UTC m=+30.224530477 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:24:57.128231 master-0 kubenswrapper[13984]: I0312 12:24:57.128032 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9x2bx\" (UniqueName: \"kubernetes.io/projected/c424f946-e2fe-4450-816b-b79640269ff5-kube-api-access-9x2bx\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:57.131350 master-0 kubenswrapper[13984]: I0312 12:24:57.131290 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9x2bx\" (UniqueName: \"kubernetes.io/projected/c424f946-e2fe-4450-816b-b79640269ff5-kube-api-access-9x2bx\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:57.227459 master-0 kubenswrapper[13984]: I0312 12:24:57.227411 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m"] Mar 12 12:24:57.233196 master-0 kubenswrapper[13984]: I0312 12:24:57.233150 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-7nb6b_ab087440-bdf2-4e2f-9a5a-434d50a2329a/etcd-operator/1.log" Mar 12 12:24:57.241353 master-0 kubenswrapper[13984]: W0312 
12:24:57.241062 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9713325e_711d_4255_a61b_4b760be05cde.slice/crio-eff987888bbd9bffcc56e59b6ac6f2e0a16cd04e4ff24ceb37c1650a6adc8262 WatchSource:0}: Error finding container eff987888bbd9bffcc56e59b6ac6f2e0a16cd04e4ff24ceb37c1650a6adc8262: Status 404 returned error can't find the container with id eff987888bbd9bffcc56e59b6ac6f2e0a16cd04e4ff24ceb37c1650a6adc8262 Mar 12 12:24:57.259810 master-0 kubenswrapper[13984]: I0312 12:24:57.259716 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-cg7rd_b5890f0c-cebe-4788-89f7-27568d875741/openshift-controller-manager-operator/0.log" Mar 12 12:24:57.380904 master-0 kubenswrapper[13984]: I0312 12:24:57.380842 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" event={"ID":"9713325e-711d-4255-a61b-4b760be05cde","Type":"ContainerStarted","Data":"022d3d08eba4164b07849aefa32ddb8b84ca267eb76de686fee16b3d8f77a9bc"} Mar 12 12:24:57.380904 master-0 kubenswrapper[13984]: I0312 12:24:57.380907 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" event={"ID":"9713325e-711d-4255-a61b-4b760be05cde","Type":"ContainerStarted","Data":"eff987888bbd9bffcc56e59b6ac6f2e0a16cd04e4ff24ceb37c1650a6adc8262"} Mar 12 12:24:57.381387 master-0 kubenswrapper[13984]: I0312 12:24:57.381197 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:57.384017 master-0 kubenswrapper[13984]: I0312 12:24:57.383957 13984 patch_prober.go:28] interesting pod/packageserver-9df8858fc-8d24m container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure 
output="Get \"https://10.128.0.62:5443/healthz\": dial tcp 10.128.0.62:5443: connect: connection refused" start-of-body= Mar 12 12:24:57.384609 master-0 kubenswrapper[13984]: I0312 12:24:57.384082 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" podUID="9713325e-711d-4255-a61b-4b760be05cde" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.62:5443/healthz\": dial tcp 10.128.0.62:5443: connect: connection refused" Mar 12 12:24:57.399355 master-0 kubenswrapper[13984]: I0312 12:24:57.399280 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" podStartSLOduration=1.39926141 podStartE2EDuration="1.39926141s" podCreationTimestamp="2026-03-12 12:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:24:57.39790508 +0000 UTC m=+29.595920562" watchObservedRunningTime="2026-03-12 12:24:57.39926141 +0000 UTC m=+29.597276922" Mar 12 12:24:57.455253 master-0 kubenswrapper[13984]: I0312 12:24:57.455157 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-cg7rd_b5890f0c-cebe-4788-89f7-27568d875741/openshift-controller-manager-operator/1.log" Mar 12 12:24:57.655632 master-0 kubenswrapper[13984]: I0312 12:24:57.655514 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager_controller-manager-6f87d47d96-c24tv_ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/controller-manager/0.log" Mar 12 12:24:57.857786 master-0 kubenswrapper[13984]: I0312 12:24:57.857745 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-7665b44c8d-2lgnf_021b22e3-b4c5-426d-b761-181f1e54175d/route-controller-manager/0.log" Mar 12 12:24:58.039371 master-0 kubenswrapper[13984]: I0312 12:24:58.039315 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:24:58.039598 master-0 kubenswrapper[13984]: E0312 12:24:58.039456 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:25:00.039437488 +0000 UTC m=+32.237452980 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:24:58.052098 master-0 kubenswrapper[13984]: I0312 12:24:58.052068 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-nwk7v_d961a5f0-84b7-47d7-846b-238475947121/catalog-operator/0.log" Mar 12 12:24:58.254296 master-0 kubenswrapper[13984]: I0312 12:24:58.254241 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-nwk7v_d961a5f0-84b7-47d7-846b-238475947121/catalog-operator/1.log" Mar 12 12:24:58.388875 master-0 kubenswrapper[13984]: I0312 12:24:58.388768 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-9df8858fc-8d24m" Mar 12 12:24:58.454577 master-0 kubenswrapper[13984]: I0312 12:24:58.452786 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-sp7w9_9bc7dea3-1868-488c-a34b-288cde3acd35/olm-operator/0.log" Mar 12 12:24:58.657900 master-0 kubenswrapper[13984]: I0312 12:24:58.657738 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-sp7w9_9bc7dea3-1868-488c-a34b-288cde3acd35/olm-operator/1.log" Mar 12 12:24:58.851297 master-0 kubenswrapper[13984]: I0312 12:24:58.851247 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-tcc85_d1d16bbc-778b-4fc1-abb2-b43e79a7c532/kube-rbac-proxy/0.log" Mar 12 12:24:59.054948 master-0 kubenswrapper[13984]: I0312 12:24:59.054884 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-tcc85_d1d16bbc-778b-4fc1-abb2-b43e79a7c532/package-server-manager/0.log" Mar 12 12:25:00.066503 master-0 kubenswrapper[13984]: I0312 12:25:00.066419 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:25:00.067091 master-0 kubenswrapper[13984]: E0312 12:25:00.066773 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:25:04.066740725 +0000 UTC m=+36.264756237 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:25:00.345498 master-0 kubenswrapper[13984]: I0312 12:25:00.345334 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:25:00.345706 master-0 kubenswrapper[13984]: I0312 12:25:00.345554 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:25:00.381842 master-0 kubenswrapper[13984]: I0312 12:25:00.381797 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l5d2w" Mar 12 12:25:04.114200 master-0 kubenswrapper[13984]: I0312 12:25:04.114100 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:25:04.114902 master-0 kubenswrapper[13984]: E0312 12:25:04.114265 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:25:12.114242128 +0000 UTC m=+44.312257620 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:25:06.473433 master-0 kubenswrapper[13984]: I0312 12:25:06.473362 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-w99c4"] Mar 12 12:25:06.474387 master-0 kubenswrapper[13984]: I0312 12:25:06.474162 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.479042 master-0 kubenswrapper[13984]: I0312 12:25:06.478987 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-lrwcw" Mar 12 12:25:06.479290 master-0 kubenswrapper[13984]: I0312 12:25:06.479192 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 12:25:06.639839 master-0 kubenswrapper[13984]: I0312 12:25:06.639781 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b840211-d5ff-4616-aa9d-50a5615e0f59-host\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.639839 master-0 kubenswrapper[13984]: I0312 12:25:06.639849 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b840211-d5ff-4616-aa9d-50a5615e0f59-serviceca\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.640090 master-0 kubenswrapper[13984]: I0312 12:25:06.639882 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf7mj\" (UniqueName: \"kubernetes.io/projected/2b840211-d5ff-4616-aa9d-50a5615e0f59-kube-api-access-mf7mj\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.740861 master-0 kubenswrapper[13984]: I0312 12:25:06.740745 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b840211-d5ff-4616-aa9d-50a5615e0f59-host\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.740861 master-0 kubenswrapper[13984]: I0312 12:25:06.740807 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/2b840211-d5ff-4616-aa9d-50a5615e0f59-serviceca\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.740861 master-0 kubenswrapper[13984]: I0312 12:25:06.740854 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf7mj\" (UniqueName: \"kubernetes.io/projected/2b840211-d5ff-4616-aa9d-50a5615e0f59-kube-api-access-mf7mj\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.741098 master-0 kubenswrapper[13984]: I0312 12:25:06.741047 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2b840211-d5ff-4616-aa9d-50a5615e0f59-host\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.741562 master-0 kubenswrapper[13984]: I0312 12:25:06.741509 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/2b840211-d5ff-4616-aa9d-50a5615e0f59-serviceca\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.768349 master-0 kubenswrapper[13984]: I0312 12:25:06.768291 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf7mj\" (UniqueName: \"kubernetes.io/projected/2b840211-d5ff-4616-aa9d-50a5615e0f59-kube-api-access-mf7mj\") pod \"node-ca-w99c4\" (UID: \"2b840211-d5ff-4616-aa9d-50a5615e0f59\") " pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.798435 master-0 kubenswrapper[13984]: I0312 12:25:06.798369 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-w99c4" Mar 12 12:25:06.821811 master-0 kubenswrapper[13984]: I0312 12:25:06.821792 13984 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 12:25:07.462579 master-0 kubenswrapper[13984]: I0312 12:25:07.462460 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w99c4" event={"ID":"2b840211-d5ff-4616-aa9d-50a5615e0f59","Type":"ContainerStarted","Data":"60949767dfbb8881e79582bb0a26d2bcf28876a5d3c635d10768f6b03975a608"} Mar 12 12:25:08.975619 master-0 kubenswrapper[13984]: I0312 12:25:08.975568 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:25:08.976064 master-0 kubenswrapper[13984]: I0312 12:25:08.975621 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod 
\"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:25:08.976064 master-0 kubenswrapper[13984]: E0312 12:25:08.975781 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:25:08.976064 master-0 kubenswrapper[13984]: E0312 12:25:08.975811 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:25:08.976064 master-0 kubenswrapper[13984]: E0312 12:25:08.975840 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:25:08.976064 master-0 kubenswrapper[13984]: E0312 12:25:08.975863 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:25:08.976064 master-0 kubenswrapper[13984]: E0312 12:25:08.975867 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:25:40.975846718 +0000 UTC m=+73.173862260 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:25:08.976064 master-0 kubenswrapper[13984]: E0312 12:25:08.975890 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:25:40.97588202 +0000 UTC m=+73.173897512 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:25:09.476437 master-0 kubenswrapper[13984]: I0312 12:25:09.476339 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-w99c4" event={"ID":"2b840211-d5ff-4616-aa9d-50a5615e0f59","Type":"ContainerStarted","Data":"3b1d34572328d35743fda5459999d50a60a214a8fc171d40286f76da03673447"} Mar 12 12:25:12.120091 master-0 kubenswrapper[13984]: I0312 12:25:12.119978 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:25:12.120979 master-0 kubenswrapper[13984]: E0312 12:25:12.120202 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca 
podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:25:28.120166086 +0000 UTC m=+60.318181618 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:25:28.128163 master-0 kubenswrapper[13984]: I0312 12:25:28.128107 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:25:28.128824 master-0 kubenswrapper[13984]: E0312 12:25:28.128262 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:00.128244986 +0000 UTC m=+92.326260468 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:25:30.795250 master-0 kubenswrapper[13984]: I0312 12:25:30.795171 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-w99c4" podStartSLOduration=22.871974897 podStartE2EDuration="24.795152506s" podCreationTimestamp="2026-03-12 12:25:06 +0000 UTC" firstStartedPulling="2026-03-12 12:25:06.821707791 +0000 UTC m=+39.019723273" lastFinishedPulling="2026-03-12 12:25:08.74488539 +0000 UTC m=+40.942900882" observedRunningTime="2026-03-12 12:25:09.502882699 +0000 UTC m=+41.700898261" watchObservedRunningTime="2026-03-12 12:25:30.795152506 +0000 UTC m=+62.993167998" Mar 12 12:25:30.798008 master-0 kubenswrapper[13984]: I0312 12:25:30.797953 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mjrlw"] Mar 12 12:25:30.799613 master-0 kubenswrapper[13984]: I0312 12:25:30.799580 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:25:30.799982 master-0 kubenswrapper[13984]: I0312 12:25:30.799949 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ml9kh"] Mar 12 12:25:30.801102 master-0 kubenswrapper[13984]: I0312 12:25:30.801076 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-2blvc" Mar 12 12:25:30.801223 master-0 kubenswrapper[13984]: I0312 12:25:30.801207 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ml9kh" Mar 12 12:25:30.803103 master-0 kubenswrapper[13984]: I0312 12:25:30.803071 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-5jzsx" Mar 12 12:25:30.811385 master-0 kubenswrapper[13984]: I0312 12:25:30.811341 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mjrlw"] Mar 12 12:25:30.822182 master-0 kubenswrapper[13984]: I0312 12:25:30.822139 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml9kh"] Mar 12 12:25:30.966958 master-0 kubenswrapper[13984]: I0312 12:25:30.966301 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8x4v\" (UniqueName: \"kubernetes.io/projected/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-kube-api-access-d8x4v\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh" Mar 12 12:25:30.966958 master-0 kubenswrapper[13984]: I0312 12:25:30.966368 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23004971-d185-418c-b0d9-5c0891456288-catalog-content\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:25:30.966958 master-0 kubenswrapper[13984]: I0312 12:25:30.966396 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23004971-d185-418c-b0d9-5c0891456288-utilities\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:25:30.966958 master-0 
kubenswrapper[13984]: I0312 12:25:30.966466 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bphr\" (UniqueName: \"kubernetes.io/projected/23004971-d185-418c-b0d9-5c0891456288-kube-api-access-9bphr\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:25:30.966958 master-0 kubenswrapper[13984]: I0312 12:25:30.966535 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-catalog-content\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh" Mar 12 12:25:30.966958 master-0 kubenswrapper[13984]: I0312 12:25:30.966561 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-utilities\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh" Mar 12 12:25:31.068076 master-0 kubenswrapper[13984]: I0312 12:25:31.067709 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bphr\" (UniqueName: \"kubernetes.io/projected/23004971-d185-418c-b0d9-5c0891456288-kube-api-access-9bphr\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:25:31.068076 master-0 kubenswrapper[13984]: I0312 12:25:31.067802 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-catalog-content\") pod \"community-operators-ml9kh\" (UID: 
\"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh"
Mar 12 12:25:31.068076 master-0 kubenswrapper[13984]: I0312 12:25:31.067828 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-utilities\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh"
Mar 12 12:25:31.068076 master-0 kubenswrapper[13984]: I0312 12:25:31.067876 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8x4v\" (UniqueName: \"kubernetes.io/projected/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-kube-api-access-d8x4v\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh"
Mar 12 12:25:31.068076 master-0 kubenswrapper[13984]: I0312 12:25:31.068032 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23004971-d185-418c-b0d9-5c0891456288-catalog-content\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw"
Mar 12 12:25:31.068403 master-0 kubenswrapper[13984]: I0312 12:25:31.068239 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23004971-d185-418c-b0d9-5c0891456288-utilities\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw"
Mar 12 12:25:31.068674 master-0 kubenswrapper[13984]: I0312 12:25:31.068623 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-catalog-content\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh"
Mar 12 12:25:31.068674 master-0 kubenswrapper[13984]: I0312 12:25:31.068656 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23004971-d185-418c-b0d9-5c0891456288-catalog-content\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw"
Mar 12 12:25:31.068778 master-0 kubenswrapper[13984]: I0312 12:25:31.068721 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23004971-d185-418c-b0d9-5c0891456288-utilities\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw"
Mar 12 12:25:31.068839 master-0 kubenswrapper[13984]: I0312 12:25:31.068785 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-utilities\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh"
Mar 12 12:25:31.088379 master-0 kubenswrapper[13984]: I0312 12:25:31.088318 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bphr\" (UniqueName: \"kubernetes.io/projected/23004971-d185-418c-b0d9-5c0891456288-kube-api-access-9bphr\") pod \"certified-operators-mjrlw\" (UID: \"23004971-d185-418c-b0d9-5c0891456288\") " pod="openshift-marketplace/certified-operators-mjrlw"
Mar 12 12:25:31.089390 master-0 kubenswrapper[13984]: I0312 12:25:31.089351 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8x4v\" (UniqueName: \"kubernetes.io/projected/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-kube-api-access-d8x4v\") pod \"community-operators-ml9kh\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " pod="openshift-marketplace/community-operators-ml9kh"
Mar 12 12:25:31.119816 master-0 kubenswrapper[13984]: I0312 12:25:31.119740 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mjrlw"
Mar 12 12:25:31.159264 master-0 kubenswrapper[13984]: I0312 12:25:31.151770 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml9kh"
Mar 12 12:25:31.561133 master-0 kubenswrapper[13984]: I0312 12:25:31.561090 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mjrlw"]
Mar 12 12:25:31.570380 master-0 kubenswrapper[13984]: W0312 12:25:31.570319 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23004971_d185_418c_b0d9_5c0891456288.slice/crio-ba9f1febb92f7ff88cffae8e9d1b9ec568ff8df779901ae7d8f088cc45ada8dc WatchSource:0}: Error finding container ba9f1febb92f7ff88cffae8e9d1b9ec568ff8df779901ae7d8f088cc45ada8dc: Status 404 returned error can't find the container with id ba9f1febb92f7ff88cffae8e9d1b9ec568ff8df779901ae7d8f088cc45ada8dc
Mar 12 12:25:31.629748 master-0 kubenswrapper[13984]: I0312 12:25:31.629646 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrlw" event={"ID":"23004971-d185-418c-b0d9-5c0891456288","Type":"ContainerStarted","Data":"ba9f1febb92f7ff88cffae8e9d1b9ec568ff8df779901ae7d8f088cc45ada8dc"}
Mar 12 12:25:31.659845 master-0 kubenswrapper[13984]: I0312 12:25:31.659790 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ml9kh"]
Mar 12 12:25:31.671577 master-0 kubenswrapper[13984]: W0312 12:25:31.671509 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff8a2e30_c594_48cb_819d_38e71fa5dd3f.slice/crio-a9b06ab179a724c70ad0609aa0ab4a3a63e845c628bf0079acd7f222760c4fb7 WatchSource:0}: Error finding container a9b06ab179a724c70ad0609aa0ab4a3a63e845c628bf0079acd7f222760c4fb7: Status 404 returned error can't find the container with id a9b06ab179a724c70ad0609aa0ab4a3a63e845c628bf0079acd7f222760c4fb7
Mar 12 12:25:32.638366 master-0 kubenswrapper[13984]: I0312 12:25:32.638249 13984 generic.go:334] "Generic (PLEG): container finished" podID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" containerID="9ac4d7e055c899d286d147117c59aa4d515ecdc1c9e6a8705a4e7a7d3181cb44" exitCode=0
Mar 12 12:25:32.638366 master-0 kubenswrapper[13984]: I0312 12:25:32.638325 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9kh" event={"ID":"ff8a2e30-c594-48cb-819d-38e71fa5dd3f","Type":"ContainerDied","Data":"9ac4d7e055c899d286d147117c59aa4d515ecdc1c9e6a8705a4e7a7d3181cb44"}
Mar 12 12:25:32.639400 master-0 kubenswrapper[13984]: I0312 12:25:32.638393 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9kh" event={"ID":"ff8a2e30-c594-48cb-819d-38e71fa5dd3f","Type":"ContainerStarted","Data":"a9b06ab179a724c70ad0609aa0ab4a3a63e845c628bf0079acd7f222760c4fb7"}
Mar 12 12:25:32.640654 master-0 kubenswrapper[13984]: I0312 12:25:32.640585 13984 generic.go:334] "Generic (PLEG): container finished" podID="23004971-d185-418c-b0d9-5c0891456288" containerID="5c9d8637d886229d58e293eb6f477d0a1d457e67c8ae675a1abbc70349e4e936" exitCode=0
Mar 12 12:25:32.640736 master-0 kubenswrapper[13984]: I0312 12:25:32.640643 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrlw" event={"ID":"23004971-d185-418c-b0d9-5c0891456288","Type":"ContainerDied","Data":"5c9d8637d886229d58e293eb6f477d0a1d457e67c8ae675a1abbc70349e4e936"}
Mar 12 12:25:32.966522 master-0 kubenswrapper[13984]: I0312 12:25:32.966427 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kjp42"]
Mar 12 12:25:32.967649 master-0 kubenswrapper[13984]: I0312 12:25:32.967604 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:32.969194 master-0 kubenswrapper[13984]: I0312 12:25:32.969143 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-l6bmn"
Mar 12 12:25:32.989731 master-0 kubenswrapper[13984]: I0312 12:25:32.989684 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjp42"]
Mar 12 12:25:33.100825 master-0 kubenswrapper[13984]: I0312 12:25:33.100758 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2c7f45-1f2d-4774-b219-4b2d34910421-catalog-content\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.101207 master-0 kubenswrapper[13984]: I0312 12:25:33.100957 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkrf\" (UniqueName: \"kubernetes.io/projected/9a2c7f45-1f2d-4774-b219-4b2d34910421-kube-api-access-tqkrf\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.101207 master-0 kubenswrapper[13984]: I0312 12:25:33.101140 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2c7f45-1f2d-4774-b219-4b2d34910421-utilities\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.172917 master-0 kubenswrapper[13984]: I0312 12:25:33.172818 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fk9sm"]
Mar 12 12:25:33.175000 master-0 kubenswrapper[13984]: I0312 12:25:33.174952 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.179107 master-0 kubenswrapper[13984]: I0312 12:25:33.179049 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-kpcrs"
Mar 12 12:25:33.225038 master-0 kubenswrapper[13984]: I0312 12:25:33.183259 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fk9sm"]
Mar 12 12:25:33.225038 master-0 kubenswrapper[13984]: I0312 12:25:33.201775 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2c7f45-1f2d-4774-b219-4b2d34910421-catalog-content\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.225038 master-0 kubenswrapper[13984]: I0312 12:25:33.201824 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkrf\" (UniqueName: \"kubernetes.io/projected/9a2c7f45-1f2d-4774-b219-4b2d34910421-kube-api-access-tqkrf\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.225038 master-0 kubenswrapper[13984]: I0312 12:25:33.201849 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2c7f45-1f2d-4774-b219-4b2d34910421-utilities\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.225038 master-0 kubenswrapper[13984]: I0312 12:25:33.202266 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a2c7f45-1f2d-4774-b219-4b2d34910421-utilities\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.225038 master-0 kubenswrapper[13984]: I0312 12:25:33.202481 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a2c7f45-1f2d-4774-b219-4b2d34910421-catalog-content\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.256931 master-0 kubenswrapper[13984]: I0312 12:25:33.256875 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkrf\" (UniqueName: \"kubernetes.io/projected/9a2c7f45-1f2d-4774-b219-4b2d34910421-kube-api-access-tqkrf\") pod \"redhat-marketplace-kjp42\" (UID: \"9a2c7f45-1f2d-4774-b219-4b2d34910421\") " pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.296065 master-0 kubenswrapper[13984]: I0312 12:25:33.296025 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kjp42"
Mar 12 12:25:33.302835 master-0 kubenswrapper[13984]: I0312 12:25:33.302801 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6635c84-7bcc-48ce-a388-170ea85a8b4f-catalog-content\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.302951 master-0 kubenswrapper[13984]: I0312 12:25:33.302861 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6635c84-7bcc-48ce-a388-170ea85a8b4f-utilities\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.302951 master-0 kubenswrapper[13984]: I0312 12:25:33.302894 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zctr7\" (UniqueName: \"kubernetes.io/projected/a6635c84-7bcc-48ce-a388-170ea85a8b4f-kube-api-access-zctr7\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.406135 master-0 kubenswrapper[13984]: I0312 12:25:33.404413 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6635c84-7bcc-48ce-a388-170ea85a8b4f-catalog-content\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.406135 master-0 kubenswrapper[13984]: I0312 12:25:33.404507 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6635c84-7bcc-48ce-a388-170ea85a8b4f-utilities\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.406135 master-0 kubenswrapper[13984]: I0312 12:25:33.404540 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zctr7\" (UniqueName: \"kubernetes.io/projected/a6635c84-7bcc-48ce-a388-170ea85a8b4f-kube-api-access-zctr7\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.406135 master-0 kubenswrapper[13984]: I0312 12:25:33.404982 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6635c84-7bcc-48ce-a388-170ea85a8b4f-catalog-content\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.406135 master-0 kubenswrapper[13984]: I0312 12:25:33.405852 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6635c84-7bcc-48ce-a388-170ea85a8b4f-utilities\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.444565 master-0 kubenswrapper[13984]: I0312 12:25:33.444431 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zctr7\" (UniqueName: \"kubernetes.io/projected/a6635c84-7bcc-48ce-a388-170ea85a8b4f-kube-api-access-zctr7\") pod \"redhat-operators-fk9sm\" (UID: \"a6635c84-7bcc-48ce-a388-170ea85a8b4f\") " pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.552754 master-0 kubenswrapper[13984]: I0312 12:25:33.552266 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fk9sm"
Mar 12 12:25:33.714468 master-0 kubenswrapper[13984]: I0312 12:25:33.714407 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kjp42"]
Mar 12 12:25:33.721648 master-0 kubenswrapper[13984]: W0312 12:25:33.721606 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a2c7f45_1f2d_4774_b219_4b2d34910421.slice/crio-a382c48b5f3f5031cfdad4cf964170dd6d64fc7054c4c129a1ac94ad023fc60b WatchSource:0}: Error finding container a382c48b5f3f5031cfdad4cf964170dd6d64fc7054c4c129a1ac94ad023fc60b: Status 404 returned error can't find the container with id a382c48b5f3f5031cfdad4cf964170dd6d64fc7054c4c129a1ac94ad023fc60b
Mar 12 12:25:33.954277 master-0 kubenswrapper[13984]: I0312 12:25:33.954232 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fk9sm"]
Mar 12 12:25:33.960629 master-0 kubenswrapper[13984]: W0312 12:25:33.960590 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6635c84_7bcc_48ce_a388_170ea85a8b4f.slice/crio-11db2f3adf20c42c28eff987f8b7799a69dd8d5121efc700b92f6b6b17f8a8b0 WatchSource:0}: Error finding container 11db2f3adf20c42c28eff987f8b7799a69dd8d5121efc700b92f6b6b17f8a8b0: Status 404 returned error can't find the container with id 11db2f3adf20c42c28eff987f8b7799a69dd8d5121efc700b92f6b6b17f8a8b0
Mar 12 12:25:34.660723 master-0 kubenswrapper[13984]: I0312 12:25:34.660623 13984 generic.go:334] "Generic (PLEG): container finished" podID="9a2c7f45-1f2d-4774-b219-4b2d34910421" containerID="dd9d88da573694b21ddc5e7ee9de670e5563bef4e354e81f398088df0763a452" exitCode=0
Mar 12 12:25:34.661010 master-0 kubenswrapper[13984]: I0312 12:25:34.660789 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjp42" event={"ID":"9a2c7f45-1f2d-4774-b219-4b2d34910421","Type":"ContainerDied","Data":"dd9d88da573694b21ddc5e7ee9de670e5563bef4e354e81f398088df0763a452"}
Mar 12 12:25:34.661010 master-0 kubenswrapper[13984]: I0312 12:25:34.660834 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjp42" event={"ID":"9a2c7f45-1f2d-4774-b219-4b2d34910421","Type":"ContainerStarted","Data":"a382c48b5f3f5031cfdad4cf964170dd6d64fc7054c4c129a1ac94ad023fc60b"}
Mar 12 12:25:34.662729 master-0 kubenswrapper[13984]: I0312 12:25:34.662700 13984 generic.go:334] "Generic (PLEG): container finished" podID="a6635c84-7bcc-48ce-a388-170ea85a8b4f" containerID="ae962afbc5aa1423b73b0f5ceea4e6db284a61cfe664637c338b4eefa828a340" exitCode=0
Mar 12 12:25:34.662805 master-0 kubenswrapper[13984]: I0312 12:25:34.662743 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9sm" event={"ID":"a6635c84-7bcc-48ce-a388-170ea85a8b4f","Type":"ContainerDied","Data":"ae962afbc5aa1423b73b0f5ceea4e6db284a61cfe664637c338b4eefa828a340"}
Mar 12 12:25:34.662805 master-0 kubenswrapper[13984]: I0312 12:25:34.662775 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9sm" event={"ID":"a6635c84-7bcc-48ce-a388-170ea85a8b4f","Type":"ContainerStarted","Data":"11db2f3adf20c42c28eff987f8b7799a69dd8d5121efc700b92f6b6b17f8a8b0"}
Mar 12 12:25:35.804602 master-0 kubenswrapper[13984]: I0312 12:25:35.804516 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml9kh"]
Mar 12 12:25:35.970727 master-0 kubenswrapper[13984]: I0312 12:25:35.970680 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8s844"]
Mar 12 12:25:35.972047 master-0 kubenswrapper[13984]: I0312 12:25:35.972011 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:35.984529 master-0 kubenswrapper[13984]: I0312 12:25:35.981565 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzr7t\" (UniqueName: \"kubernetes.io/projected/fa29115a-70fa-4a85-a071-66f11bd82829-kube-api-access-bzr7t\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:35.984529 master-0 kubenswrapper[13984]: I0312 12:25:35.981676 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29115a-70fa-4a85-a071-66f11bd82829-catalog-content\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:35.984529 master-0 kubenswrapper[13984]: I0312 12:25:35.981711 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29115a-70fa-4a85-a071-66f11bd82829-utilities\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:36.004557 master-0 kubenswrapper[13984]: I0312 12:25:35.994384 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s844"]
Mar 12 12:25:36.083099 master-0 kubenswrapper[13984]: I0312 12:25:36.082970 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzr7t\" (UniqueName: \"kubernetes.io/projected/fa29115a-70fa-4a85-a071-66f11bd82829-kube-api-access-bzr7t\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:36.083099 master-0 kubenswrapper[13984]: I0312 12:25:36.083043 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29115a-70fa-4a85-a071-66f11bd82829-catalog-content\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:36.083343 master-0 kubenswrapper[13984]: I0312 12:25:36.083247 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29115a-70fa-4a85-a071-66f11bd82829-utilities\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:36.083597 master-0 kubenswrapper[13984]: I0312 12:25:36.083574 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa29115a-70fa-4a85-a071-66f11bd82829-catalog-content\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:36.083693 master-0 kubenswrapper[13984]: I0312 12:25:36.083666 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa29115a-70fa-4a85-a071-66f11bd82829-utilities\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:36.113219 master-0 kubenswrapper[13984]: I0312 12:25:36.113160 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzr7t\" (UniqueName: \"kubernetes.io/projected/fa29115a-70fa-4a85-a071-66f11bd82829-kube-api-access-bzr7t\") pod \"community-operators-8s844\" (UID: \"fa29115a-70fa-4a85-a071-66f11bd82829\") " pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:36.308231 master-0 kubenswrapper[13984]: I0312 12:25:36.308166 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8s844"
Mar 12 12:25:37.328587 master-0 kubenswrapper[13984]: I0312 12:25:37.328528 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8s844"]
Mar 12 12:25:37.817456 master-0 kubenswrapper[13984]: I0312 12:25:37.817408 13984 generic.go:334] "Generic (PLEG): container finished" podID="fa29115a-70fa-4a85-a071-66f11bd82829" containerID="1866a509b71432d0f09a98cc3e157200140a7655fda48bf3ccc3155d08b93d06" exitCode=0
Mar 12 12:25:37.817456 master-0 kubenswrapper[13984]: I0312 12:25:37.817456 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s844" event={"ID":"fa29115a-70fa-4a85-a071-66f11bd82829","Type":"ContainerDied","Data":"1866a509b71432d0f09a98cc3e157200140a7655fda48bf3ccc3155d08b93d06"}
Mar 12 12:25:37.817733 master-0 kubenswrapper[13984]: I0312 12:25:37.817497 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s844" event={"ID":"fa29115a-70fa-4a85-a071-66f11bd82829","Type":"ContainerStarted","Data":"5acdd6e39656c20901b0dbba69477dceed1349d5e1f429b32597fa38d18a6912"}
Mar 12 12:25:41.076158 master-0 kubenswrapper[13984]: I0312 12:25:41.076066 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:25:41.076898 master-0 kubenswrapper[13984]: I0312 12:25:41.076219 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:25:41.076898 master-0 kubenswrapper[13984]: E0312 12:25:41.076548 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:25:41.076898 master-0 kubenswrapper[13984]: E0312 12:25:41.076603 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:25:41.076898 master-0 kubenswrapper[13984]: E0312 12:25:41.076654 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:25:41.076898 master-0 kubenswrapper[13984]: E0312 12:25:41.076688 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:25:41.076898 master-0 kubenswrapper[13984]: E0312 12:25:41.076714 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:26:45.076683004 +0000 UTC m=+137.274698536 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered
Mar 12 12:25:41.076898 master-0 kubenswrapper[13984]: E0312 12:25:41.076750 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:45.076728276 +0000 UTC m=+137.274743808 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:25:54.376784 master-0 kubenswrapper[13984]: I0312 12:25:54.376724 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dmwgl"]
Mar 12 12:25:54.379109 master-0 kubenswrapper[13984]: I0312 12:25:54.379085 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.383310 master-0 kubenswrapper[13984]: I0312 12:25:54.382064 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-c479d"
Mar 12 12:25:54.383310 master-0 kubenswrapper[13984]: I0312 12:25:54.382314 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 12 12:25:54.464783 master-0 kubenswrapper[13984]: I0312 12:25:54.464734 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c10b4652-67a3-498f-832f-77be2311a248-mcd-auth-proxy-config\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.464783 master-0 kubenswrapper[13984]: I0312 12:25:54.464786 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c10b4652-67a3-498f-832f-77be2311a248-proxy-tls\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.465060 master-0 kubenswrapper[13984]: I0312 12:25:54.464819 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c10b4652-67a3-498f-832f-77be2311a248-rootfs\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.465060 master-0 kubenswrapper[13984]: I0312 12:25:54.464863 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gb4r4\" (UniqueName: \"kubernetes.io/projected/c10b4652-67a3-498f-832f-77be2311a248-kube-api-access-gb4r4\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.566079 master-0 kubenswrapper[13984]: I0312 12:25:54.565941 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c10b4652-67a3-498f-832f-77be2311a248-mcd-auth-proxy-config\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.566079 master-0 kubenswrapper[13984]: I0312 12:25:54.565997 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c10b4652-67a3-498f-832f-77be2311a248-proxy-tls\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.566079 master-0 kubenswrapper[13984]: I0312 12:25:54.566025 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c10b4652-67a3-498f-832f-77be2311a248-rootfs\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.567129 master-0 kubenswrapper[13984]: I0312 12:25:54.566583 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gb4r4\" (UniqueName: \"kubernetes.io/projected/c10b4652-67a3-498f-832f-77be2311a248-kube-api-access-gb4r4\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.567129 master-0 kubenswrapper[13984]: I0312 12:25:54.566645 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/c10b4652-67a3-498f-832f-77be2311a248-rootfs\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.567259 master-0 kubenswrapper[13984]: I0312 12:25:54.567218 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c10b4652-67a3-498f-832f-77be2311a248-mcd-auth-proxy-config\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.571811 master-0 kubenswrapper[13984]: I0312 12:25:54.571772 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c10b4652-67a3-498f-832f-77be2311a248-proxy-tls\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.584146 master-0 kubenswrapper[13984]: I0312 12:25:54.584107 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gb4r4\" (UniqueName: \"kubernetes.io/projected/c10b4652-67a3-498f-832f-77be2311a248-kube-api-access-gb4r4\") pod \"machine-config-daemon-dmwgl\" (UID: \"c10b4652-67a3-498f-832f-77be2311a248\") " pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:54.699564 master-0 kubenswrapper[13984]: I0312 12:25:54.699499 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dmwgl"
Mar 12 12:25:57.830310 master-0 kubenswrapper[13984]: W0312 12:25:57.830193 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc10b4652_67a3_498f_832f_77be2311a248.slice/crio-b9656cc5cb56ad0a7b3dd70ccf10740b01d503913eefd90bef268602be444392 WatchSource:0}: Error finding container b9656cc5cb56ad0a7b3dd70ccf10740b01d503913eefd90bef268602be444392: Status 404 returned error can't find the container with id b9656cc5cb56ad0a7b3dd70ccf10740b01d503913eefd90bef268602be444392
Mar 12 12:25:58.050035 master-0 kubenswrapper[13984]: I0312 12:25:58.049991 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dmwgl" event={"ID":"c10b4652-67a3-498f-832f-77be2311a248","Type":"ContainerStarted","Data":"b9656cc5cb56ad0a7b3dd70ccf10740b01d503913eefd90bef268602be444392"}
Mar 12 12:25:59.058910 master-0 kubenswrapper[13984]: I0312 12:25:59.058847 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dmwgl" event={"ID":"c10b4652-67a3-498f-832f-77be2311a248","Type":"ContainerStarted","Data":"4556fe70e39a7df42eb653ac6808f0250494aa968c58406db0b980771e281604"}
Mar 12 12:25:59.058910 master-0 kubenswrapper[13984]: I0312 12:25:59.058902 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dmwgl" event={"ID":"c10b4652-67a3-498f-832f-77be2311a248","Type":"ContainerStarted","Data":"da15c36f88e02c7d8a5e5311d807bf7cf5454a3f49617f46e33650be9a505927"}
Mar 12 12:25:59.062423 master-0 kubenswrapper[13984]: I0312 12:25:59.062379 13984 generic.go:334] "Generic (PLEG): container finished" podID="fa29115a-70fa-4a85-a071-66f11bd82829" containerID="4f3994402510ea4eaf52d8353e34d99e6a423b0aaa46897e54bf799717363807" exitCode=0
Mar 12 12:25:59.062518
master-0 kubenswrapper[13984]: I0312 12:25:59.062497 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s844" event={"ID":"fa29115a-70fa-4a85-a071-66f11bd82829","Type":"ContainerDied","Data":"4f3994402510ea4eaf52d8353e34d99e6a423b0aaa46897e54bf799717363807"} Mar 12 12:25:59.073537 master-0 kubenswrapper[13984]: I0312 12:25:59.069463 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9sm" event={"ID":"a6635c84-7bcc-48ce-a388-170ea85a8b4f","Type":"ContainerStarted","Data":"df2ee40e1cf15c2921e28d11299f256ce083e236404c03aaf2f64864c6d74359"} Mar 12 12:25:59.077244 master-0 kubenswrapper[13984]: I0312 12:25:59.077195 13984 generic.go:334] "Generic (PLEG): container finished" podID="23004971-d185-418c-b0d9-5c0891456288" containerID="5dbaf06b74e75ebd14d1bb4557628628777ff6d32124639c6ce829be319d8488" exitCode=0 Mar 12 12:25:59.077353 master-0 kubenswrapper[13984]: I0312 12:25:59.077258 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrlw" event={"ID":"23004971-d185-418c-b0d9-5c0891456288","Type":"ContainerDied","Data":"5dbaf06b74e75ebd14d1bb4557628628777ff6d32124639c6ce829be319d8488"} Mar 12 12:25:59.086799 master-0 kubenswrapper[13984]: I0312 12:25:59.085600 13984 generic.go:334] "Generic (PLEG): container finished" podID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" containerID="a4d90516edb9a8ebf48a5fbd5067f2ddf2e3f750afc58d96e15c7a940d8c6ffd" exitCode=0 Mar 12 12:25:59.086799 master-0 kubenswrapper[13984]: I0312 12:25:59.085704 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9kh" event={"ID":"ff8a2e30-c594-48cb-819d-38e71fa5dd3f","Type":"ContainerDied","Data":"a4d90516edb9a8ebf48a5fbd5067f2ddf2e3f750afc58d96e15c7a940d8c6ffd"} Mar 12 12:25:59.088896 master-0 kubenswrapper[13984]: I0312 12:25:59.088856 13984 generic.go:334] "Generic (PLEG): container finished" 
podID="9a2c7f45-1f2d-4774-b219-4b2d34910421" containerID="36399f0020d09373aded18fc6d8ee14e3e40fa6723a94a7c98814c211f30d6b0" exitCode=0 Mar 12 12:25:59.088954 master-0 kubenswrapper[13984]: I0312 12:25:59.088915 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjp42" event={"ID":"9a2c7f45-1f2d-4774-b219-4b2d34910421","Type":"ContainerDied","Data":"36399f0020d09373aded18fc6d8ee14e3e40fa6723a94a7c98814c211f30d6b0"} Mar 12 12:25:59.351339 master-0 kubenswrapper[13984]: I0312 12:25:59.351233 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml9kh" Mar 12 12:25:59.445006 master-0 kubenswrapper[13984]: I0312 12:25:59.444963 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-catalog-content\") pod \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " Mar 12 12:25:59.445222 master-0 kubenswrapper[13984]: I0312 12:25:59.445055 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8x4v\" (UniqueName: \"kubernetes.io/projected/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-kube-api-access-d8x4v\") pod \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " Mar 12 12:25:59.445222 master-0 kubenswrapper[13984]: I0312 12:25:59.445116 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-utilities\") pod \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\" (UID: \"ff8a2e30-c594-48cb-819d-38e71fa5dd3f\") " Mar 12 12:25:59.446575 master-0 kubenswrapper[13984]: I0312 12:25:59.446535 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-utilities" (OuterVolumeSpecName: "utilities") pod "ff8a2e30-c594-48cb-819d-38e71fa5dd3f" (UID: "ff8a2e30-c594-48cb-819d-38e71fa5dd3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:25:59.448912 master-0 kubenswrapper[13984]: I0312 12:25:59.448834 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-kube-api-access-d8x4v" (OuterVolumeSpecName: "kube-api-access-d8x4v") pod "ff8a2e30-c594-48cb-819d-38e71fa5dd3f" (UID: "ff8a2e30-c594-48cb-819d-38e71fa5dd3f"). InnerVolumeSpecName "kube-api-access-d8x4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:25:59.547401 master-0 kubenswrapper[13984]: I0312 12:25:59.547336 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8x4v\" (UniqueName: \"kubernetes.io/projected/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-kube-api-access-d8x4v\") on node \"master-0\" DevicePath \"\"" Mar 12 12:25:59.547401 master-0 kubenswrapper[13984]: I0312 12:25:59.547393 13984 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-utilities\") on node \"master-0\" DevicePath \"\"" Mar 12 12:25:59.579427 master-0 kubenswrapper[13984]: I0312 12:25:59.579339 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dmwgl" podStartSLOduration=5.579312364 podStartE2EDuration="5.579312364s" podCreationTimestamp="2026-03-12 12:25:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:25:59.576661132 +0000 UTC m=+91.774676694" watchObservedRunningTime="2026-03-12 12:25:59.579312364 +0000 UTC m=+91.777327896" Mar 12 12:26:00.102822 master-0 kubenswrapper[13984]: 
I0312 12:26:00.102762 13984 generic.go:334] "Generic (PLEG): container finished" podID="a6635c84-7bcc-48ce-a388-170ea85a8b4f" containerID="df2ee40e1cf15c2921e28d11299f256ce083e236404c03aaf2f64864c6d74359" exitCode=0 Mar 12 12:26:00.103424 master-0 kubenswrapper[13984]: I0312 12:26:00.102851 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9sm" event={"ID":"a6635c84-7bcc-48ce-a388-170ea85a8b4f","Type":"ContainerDied","Data":"df2ee40e1cf15c2921e28d11299f256ce083e236404c03aaf2f64864c6d74359"} Mar 12 12:26:00.106182 master-0 kubenswrapper[13984]: I0312 12:26:00.106147 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ml9kh" Mar 12 12:26:00.106243 master-0 kubenswrapper[13984]: I0312 12:26:00.106197 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ml9kh" event={"ID":"ff8a2e30-c594-48cb-819d-38e71fa5dd3f","Type":"ContainerDied","Data":"a9b06ab179a724c70ad0609aa0ab4a3a63e845c628bf0079acd7f222760c4fb7"} Mar 12 12:26:00.106315 master-0 kubenswrapper[13984]: I0312 12:26:00.106279 13984 scope.go:117] "RemoveContainer" containerID="a4d90516edb9a8ebf48a5fbd5067f2ddf2e3f750afc58d96e15c7a940d8c6ffd" Mar 12 12:26:00.126267 master-0 kubenswrapper[13984]: I0312 12:26:00.126228 13984 scope.go:117] "RemoveContainer" containerID="9ac4d7e055c899d286d147117c59aa4d515ecdc1c9e6a8705a4e7a7d3181cb44" Mar 12 12:26:00.165362 master-0 kubenswrapper[13984]: I0312 12:26:00.165261 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:26:00.165650 master-0 kubenswrapper[13984]: E0312 12:26:00.165584 13984 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:27:04.165545391 +0000 UTC m=+156.363560923 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:00.640468 master-0 kubenswrapper[13984]: I0312 12:26:00.640358 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ff8a2e30-c594-48cb-819d-38e71fa5dd3f" (UID: "ff8a2e30-c594-48cb-819d-38e71fa5dd3f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:26:00.674665 master-0 kubenswrapper[13984]: I0312 12:26:00.674607 13984 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ff8a2e30-c594-48cb-819d-38e71fa5dd3f-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 12 12:26:02.305099 master-0 kubenswrapper[13984]: I0312 12:26:02.305043 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ml9kh"] Mar 12 12:26:02.685748 master-0 kubenswrapper[13984]: I0312 12:26:02.685639 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ml9kh"] Mar 12 12:26:03.987999 master-0 kubenswrapper[13984]: I0312 12:26:03.987951 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" path="/var/lib/kubelet/pods/ff8a2e30-c594-48cb-819d-38e71fa5dd3f/volumes" Mar 12 12:26:04.278681 master-0 kubenswrapper[13984]: I0312 12:26:04.278501 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs"] Mar 12 12:26:04.278890 master-0 kubenswrapper[13984]: E0312 12:26:04.278862 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" containerName="extract-utilities" Mar 12 12:26:04.278958 master-0 kubenswrapper[13984]: I0312 12:26:04.278919 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" containerName="extract-utilities" Mar 12 12:26:04.278958 master-0 kubenswrapper[13984]: E0312 12:26:04.278942 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" containerName="extract-content" Mar 12 12:26:04.278958 master-0 kubenswrapper[13984]: I0312 12:26:04.278948 13984 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" containerName="extract-content" Mar 12 12:26:04.279064 master-0 kubenswrapper[13984]: I0312 12:26:04.279049 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff8a2e30-c594-48cb-819d-38e71fa5dd3f" containerName="extract-content" Mar 12 12:26:04.279700 master-0 kubenswrapper[13984]: I0312 12:26:04.279676 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.284710 master-0 kubenswrapper[13984]: I0312 12:26:04.283656 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs"] Mar 12 12:26:04.284710 master-0 kubenswrapper[13984]: I0312 12:26:04.283662 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 12 12:26:04.289872 master-0 kubenswrapper[13984]: I0312 12:26:04.288882 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-hsrzl" Mar 12 12:26:04.339168 master-0 kubenswrapper[13984]: I0312 12:26:04.339115 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ec202b2-e051-466b-a3fb-c054db0a9b16-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.339435 master-0 kubenswrapper[13984]: I0312 12:26:04.339202 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zszmx\" (UniqueName: \"kubernetes.io/projected/9ec202b2-e051-466b-a3fb-c054db0a9b16-kube-api-access-zszmx\") pod 
\"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.339435 master-0 kubenswrapper[13984]: I0312 12:26:04.339294 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec202b2-e051-466b-a3fb-c054db0a9b16-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.448136 master-0 kubenswrapper[13984]: I0312 12:26:04.440282 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zszmx\" (UniqueName: \"kubernetes.io/projected/9ec202b2-e051-466b-a3fb-c054db0a9b16-kube-api-access-zszmx\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.448136 master-0 kubenswrapper[13984]: I0312 12:26:04.440400 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec202b2-e051-466b-a3fb-c054db0a9b16-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.448136 master-0 kubenswrapper[13984]: I0312 12:26:04.440458 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ec202b2-e051-466b-a3fb-c054db0a9b16-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " 
pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.448136 master-0 kubenswrapper[13984]: I0312 12:26:04.441507 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9ec202b2-e051-466b-a3fb-c054db0a9b16-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.448136 master-0 kubenswrapper[13984]: I0312 12:26:04.446055 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ec202b2-e051-466b-a3fb-c054db0a9b16-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.457391 master-0 kubenswrapper[13984]: I0312 12:26:04.456866 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zszmx\" (UniqueName: \"kubernetes.io/projected/9ec202b2-e051-466b-a3fb-c054db0a9b16-kube-api-access-zszmx\") pod \"machine-config-controller-ff46b7bdf-hdhhs\" (UID: \"9ec202b2-e051-466b-a3fb-c054db0a9b16\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:04.619186 master-0 kubenswrapper[13984]: I0312 12:26:04.619005 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" Mar 12 12:26:05.105848 master-0 kubenswrapper[13984]: I0312 12:26:05.105768 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs"] Mar 12 12:26:05.116787 master-0 kubenswrapper[13984]: W0312 12:26:05.116731 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ec202b2_e051_466b_a3fb_c054db0a9b16.slice/crio-eedc7a4db7048045c14c389ddbe1375bd1c812635f8b3811e127bf98fe20f225 WatchSource:0}: Error finding container eedc7a4db7048045c14c389ddbe1375bd1c812635f8b3811e127bf98fe20f225: Status 404 returned error can't find the container with id eedc7a4db7048045c14c389ddbe1375bd1c812635f8b3811e127bf98fe20f225 Mar 12 12:26:05.157875 master-0 kubenswrapper[13984]: I0312 12:26:05.157819 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" event={"ID":"9ec202b2-e051-466b-a3fb-c054db0a9b16","Type":"ContainerStarted","Data":"eedc7a4db7048045c14c389ddbe1375bd1c812635f8b3811e127bf98fe20f225"} Mar 12 12:26:06.164003 master-0 kubenswrapper[13984]: I0312 12:26:06.163836 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" event={"ID":"9ec202b2-e051-466b-a3fb-c054db0a9b16","Type":"ContainerStarted","Data":"1df1f496be3b4e1e8b0655b191e9ae344f113db45974d4edcfe571cf0c573c1d"} Mar 12 12:26:06.650493 master-0 kubenswrapper[13984]: I0312 12:26:06.650382 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc"] Mar 12 12:26:06.652886 master-0 kubenswrapper[13984]: I0312 12:26:06.652854 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" Mar 12 12:26:06.654450 master-0 kubenswrapper[13984]: I0312 12:26:06.654420 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6"] Mar 12 12:26:06.655026 master-0 kubenswrapper[13984]: I0312 12:26:06.654990 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" Mar 12 12:26:06.657284 master-0 kubenswrapper[13984]: I0312 12:26:06.657253 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 12 12:26:06.658053 master-0 kubenswrapper[13984]: I0312 12:26:06.658001 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-pczct"] Mar 12 12:26:06.659147 master-0 kubenswrapper[13984]: I0312 12:26:06.659106 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.663103 master-0 kubenswrapper[13984]: I0312 12:26:06.663064 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 12 12:26:06.663356 master-0 kubenswrapper[13984]: I0312 12:26:06.663329 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 12 12:26:06.663521 master-0 kubenswrapper[13984]: I0312 12:26:06.663501 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 12:26:06.663672 master-0 kubenswrapper[13984]: I0312 12:26:06.663646 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 12 12:26:06.663704 master-0 kubenswrapper[13984]: I0312 12:26:06.663684 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 12:26:06.663753 master-0 kubenswrapper[13984]: I0312 12:26:06.663731 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 12 12:26:06.677447 master-0 kubenswrapper[13984]: I0312 12:26:06.677399 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-default-certificate\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.677567 master-0 kubenswrapper[13984]: I0312 12:26:06.677455 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1aba020f-dfec-455d-83d7-728145cc4114-tls-certificates\") pod 
\"prometheus-operator-admission-webhook-8464df8497-c6pf6\" (UID: \"1aba020f-dfec-455d-83d7-728145cc4114\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" Mar 12 12:26:06.677567 master-0 kubenswrapper[13984]: I0312 12:26:06.677545 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7s46\" (UniqueName: \"kubernetes.io/projected/f7e806c0-793c-42ba-91cb-12d64ba841d3-kube-api-access-j7s46\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.677649 master-0 kubenswrapper[13984]: I0312 12:26:06.677583 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-stats-auth\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.677683 master-0 kubenswrapper[13984]: I0312 12:26:06.677651 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7e806c0-793c-42ba-91cb-12d64ba841d3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.677719 master-0 kubenswrapper[13984]: I0312 12:26:06.677693 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-metrics-certs\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.778943 master-0 kubenswrapper[13984]: I0312 
12:26:06.778883 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7s46\" (UniqueName: \"kubernetes.io/projected/f7e806c0-793c-42ba-91cb-12d64ba841d3-kube-api-access-j7s46\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.779250 master-0 kubenswrapper[13984]: I0312 12:26:06.779181 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-stats-auth\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.779369 master-0 kubenswrapper[13984]: I0312 12:26:06.779345 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7e806c0-793c-42ba-91cb-12d64ba841d3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.779505 master-0 kubenswrapper[13984]: I0312 12:26:06.779458 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-metrics-certs\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.779576 master-0 kubenswrapper[13984]: I0312 12:26:06.779563 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-default-certificate\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " 
pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.779623 master-0 kubenswrapper[13984]: I0312 12:26:06.779594 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1aba020f-dfec-455d-83d7-728145cc4114-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-c6pf6\" (UID: \"1aba020f-dfec-455d-83d7-728145cc4114\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" Mar 12 12:26:06.779671 master-0 kubenswrapper[13984]: I0312 12:26:06.779628 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26v88\" (UniqueName: \"kubernetes.io/projected/46328b96-3d2c-4be0-ad3e-5e0d816173c7-kube-api-access-26v88\") pod \"network-check-source-7c67b67d47-7bdwc\" (UID: \"46328b96-3d2c-4be0-ad3e-5e0d816173c7\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" Mar 12 12:26:06.780701 master-0 kubenswrapper[13984]: I0312 12:26:06.780652 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7e806c0-793c-42ba-91cb-12d64ba841d3-service-ca-bundle\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.787105 master-0 kubenswrapper[13984]: I0312 12:26:06.782530 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-stats-auth\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.787105 master-0 kubenswrapper[13984]: I0312 12:26:06.783821 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-metrics-certs\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.787105 master-0 kubenswrapper[13984]: I0312 12:26:06.784035 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f7e806c0-793c-42ba-91cb-12d64ba841d3-default-certificate\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.787105 master-0 kubenswrapper[13984]: I0312 12:26:06.785093 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/1aba020f-dfec-455d-83d7-728145cc4114-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-c6pf6\" (UID: \"1aba020f-dfec-455d-83d7-728145cc4114\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" Mar 12 12:26:06.881033 master-0 kubenswrapper[13984]: I0312 12:26:06.880967 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26v88\" (UniqueName: \"kubernetes.io/projected/46328b96-3d2c-4be0-ad3e-5e0d816173c7-kube-api-access-26v88\") pod \"network-check-source-7c67b67d47-7bdwc\" (UID: \"46328b96-3d2c-4be0-ad3e-5e0d816173c7\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" Mar 12 12:26:06.990167 master-0 kubenswrapper[13984]: I0312 12:26:06.989117 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc"] Mar 12 12:26:06.990167 master-0 kubenswrapper[13984]: I0312 12:26:06.989154 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6"] Mar 12 12:26:06.996823 
master-0 kubenswrapper[13984]: I0312 12:26:06.996772 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7s46\" (UniqueName: \"kubernetes.io/projected/f7e806c0-793c-42ba-91cb-12d64ba841d3-kube-api-access-j7s46\") pod \"router-default-79f8cd6fdd-pczct\" (UID: \"f7e806c0-793c-42ba-91cb-12d64ba841d3\") " pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:06.997382 master-0 kubenswrapper[13984]: I0312 12:26:06.997340 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26v88\" (UniqueName: \"kubernetes.io/projected/46328b96-3d2c-4be0-ad3e-5e0d816173c7-kube-api-access-26v88\") pod \"network-check-source-7c67b67d47-7bdwc\" (UID: \"46328b96-3d2c-4be0-ad3e-5e0d816173c7\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" Mar 12 12:26:06.998923 master-0 kubenswrapper[13984]: I0312 12:26:06.998883 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" Mar 12 12:26:07.014899 master-0 kubenswrapper[13984]: I0312 12:26:07.014860 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:07.278403 master-0 kubenswrapper[13984]: I0312 12:26:07.277975 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" Mar 12 12:26:07.495887 master-0 kubenswrapper[13984]: I0312 12:26:07.492980 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vb6qv"] Mar 12 12:26:07.495887 master-0 kubenswrapper[13984]: I0312 12:26:07.494860 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:07.497272 master-0 kubenswrapper[13984]: I0312 12:26:07.497244 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 12 12:26:07.497468 master-0 kubenswrapper[13984]: I0312 12:26:07.497452 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 12 12:26:07.497732 master-0 kubenswrapper[13984]: I0312 12:26:07.497713 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-6rn7c" Mar 12 12:26:07.497938 master-0 kubenswrapper[13984]: I0312 12:26:07.497918 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 12 12:26:07.499277 master-0 kubenswrapper[13984]: I0312 12:26:07.499242 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7g47\" (UniqueName: \"kubernetes.io/projected/bd171751-5c5c-458f-8e24-4ffde0aa2501-kube-api-access-v7g47\") pod \"ingress-canary-vb6qv\" (UID: \"bd171751-5c5c-458f-8e24-4ffde0aa2501\") " pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:07.499508 master-0 kubenswrapper[13984]: I0312 12:26:07.499462 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd171751-5c5c-458f-8e24-4ffde0aa2501-cert\") pod \"ingress-canary-vb6qv\" (UID: \"bd171751-5c5c-458f-8e24-4ffde0aa2501\") " pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:07.600744 master-0 kubenswrapper[13984]: I0312 12:26:07.600556 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7g47\" (UniqueName: \"kubernetes.io/projected/bd171751-5c5c-458f-8e24-4ffde0aa2501-kube-api-access-v7g47\") pod 
\"ingress-canary-vb6qv\" (UID: \"bd171751-5c5c-458f-8e24-4ffde0aa2501\") " pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:07.600744 master-0 kubenswrapper[13984]: I0312 12:26:07.600646 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd171751-5c5c-458f-8e24-4ffde0aa2501-cert\") pod \"ingress-canary-vb6qv\" (UID: \"bd171751-5c5c-458f-8e24-4ffde0aa2501\") " pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:07.604240 master-0 kubenswrapper[13984]: I0312 12:26:07.604203 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd171751-5c5c-458f-8e24-4ffde0aa2501-cert\") pod \"ingress-canary-vb6qv\" (UID: \"bd171751-5c5c-458f-8e24-4ffde0aa2501\") " pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:07.822985 master-0 kubenswrapper[13984]: I0312 12:26:07.821631 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vb6qv"] Mar 12 12:26:07.826600 master-0 kubenswrapper[13984]: I0312 12:26:07.826378 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7g47\" (UniqueName: \"kubernetes.io/projected/bd171751-5c5c-458f-8e24-4ffde0aa2501-kube-api-access-v7g47\") pod \"ingress-canary-vb6qv\" (UID: \"bd171751-5c5c-458f-8e24-4ffde0aa2501\") " pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:08.112621 master-0 kubenswrapper[13984]: I0312 12:26:08.112556 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vb6qv" Mar 12 12:26:08.875557 master-0 kubenswrapper[13984]: W0312 12:26:08.874278 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e806c0_793c_42ba_91cb_12d64ba841d3.slice/crio-ab115f48d3260e40c6e72658777d46b45eca001fe620b3e80b95b2777a59ff66 WatchSource:0}: Error finding container ab115f48d3260e40c6e72658777d46b45eca001fe620b3e80b95b2777a59ff66: Status 404 returned error can't find the container with id ab115f48d3260e40c6e72658777d46b45eca001fe620b3e80b95b2777a59ff66 Mar 12 12:26:09.182443 master-0 kubenswrapper[13984]: I0312 12:26:09.182391 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" event={"ID":"f7e806c0-793c-42ba-91cb-12d64ba841d3","Type":"ContainerStarted","Data":"ab115f48d3260e40c6e72658777d46b45eca001fe620b3e80b95b2777a59ff66"} Mar 12 12:26:09.957229 master-0 kubenswrapper[13984]: I0312 12:26:09.957103 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vb6qv"] Mar 12 12:26:09.963769 master-0 kubenswrapper[13984]: I0312 12:26:09.960998 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc"] Mar 12 12:26:09.966886 master-0 kubenswrapper[13984]: W0312 12:26:09.966795 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd171751_5c5c_458f_8e24_4ffde0aa2501.slice/crio-d417aac46b9225bc996df29566dbc799851bf67d7de74035c10705e9f1815b76 WatchSource:0}: Error finding container d417aac46b9225bc996df29566dbc799851bf67d7de74035c10705e9f1815b76: Status 404 returned error can't find the container with id d417aac46b9225bc996df29566dbc799851bf67d7de74035c10705e9f1815b76 Mar 12 12:26:09.970346 master-0 kubenswrapper[13984]: W0312 12:26:09.970315 13984 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1aba020f_dfec_455d_83d7_728145cc4114.slice/crio-9ea5c79e0120258d88cb62fab88d44cd8d87d535599490b6d633325d6b34ab4c WatchSource:0}: Error finding container 9ea5c79e0120258d88cb62fab88d44cd8d87d535599490b6d633325d6b34ab4c: Status 404 returned error can't find the container with id 9ea5c79e0120258d88cb62fab88d44cd8d87d535599490b6d633325d6b34ab4c Mar 12 12:26:09.970704 master-0 kubenswrapper[13984]: W0312 12:26:09.970663 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46328b96_3d2c_4be0_ad3e_5e0d816173c7.slice/crio-a798d46e4be66a69243854743908ec92a69e96a1edf3658e473766d692773079 WatchSource:0}: Error finding container a798d46e4be66a69243854743908ec92a69e96a1edf3658e473766d692773079: Status 404 returned error can't find the container with id a798d46e4be66a69243854743908ec92a69e96a1edf3658e473766d692773079 Mar 12 12:26:09.972014 master-0 kubenswrapper[13984]: I0312 12:26:09.971975 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6"] Mar 12 12:26:10.190781 master-0 kubenswrapper[13984]: I0312 12:26:10.190696 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" event={"ID":"1aba020f-dfec-455d-83d7-728145cc4114","Type":"ContainerStarted","Data":"9ea5c79e0120258d88cb62fab88d44cd8d87d535599490b6d633325d6b34ab4c"} Mar 12 12:26:10.192359 master-0 kubenswrapper[13984]: I0312 12:26:10.192219 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vb6qv" event={"ID":"bd171751-5c5c-458f-8e24-4ffde0aa2501","Type":"ContainerStarted","Data":"d417aac46b9225bc996df29566dbc799851bf67d7de74035c10705e9f1815b76"} Mar 12 12:26:10.194715 master-0 kubenswrapper[13984]: 
I0312 12:26:10.194652 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" event={"ID":"9ec202b2-e051-466b-a3fb-c054db0a9b16","Type":"ContainerStarted","Data":"26efc55fcbf07cf19ead6846cac4e3aa9840034475b51183f1207c0642300bf4"} Mar 12 12:26:10.196385 master-0 kubenswrapper[13984]: I0312 12:26:10.196330 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" event={"ID":"46328b96-3d2c-4be0-ad3e-5e0d816173c7","Type":"ContainerStarted","Data":"a798d46e4be66a69243854743908ec92a69e96a1edf3658e473766d692773079"} Mar 12 12:26:10.401037 master-0 kubenswrapper[13984]: I0312 12:26:10.400825 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" podStartSLOduration=6.400805217 podStartE2EDuration="6.400805217s" podCreationTimestamp="2026-03-12 12:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:26:10.399201654 +0000 UTC m=+102.597217166" watchObservedRunningTime="2026-03-12 12:26:10.400805217 +0000 UTC m=+102.598820719" Mar 12 12:26:11.211055 master-0 kubenswrapper[13984]: I0312 12:26:11.210999 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kjp42" event={"ID":"9a2c7f45-1f2d-4774-b219-4b2d34910421","Type":"ContainerStarted","Data":"c4ed7b611bc3fcab080d81754b0ffdc3bf7ccc6e352a1173cea33b51f2881bf6"} Mar 12 12:26:11.214874 master-0 kubenswrapper[13984]: I0312 12:26:11.214833 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vb6qv" event={"ID":"bd171751-5c5c-458f-8e24-4ffde0aa2501","Type":"ContainerStarted","Data":"80efd31bf7e9db050de497d9dc0c4c71d158babec1eae31c89bf552f2aa10b50"} Mar 12 12:26:11.216467 master-0 
kubenswrapper[13984]: I0312 12:26:11.216433 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" event={"ID":"46328b96-3d2c-4be0-ad3e-5e0d816173c7","Type":"ContainerStarted","Data":"6e202a7e6aa7df5db7e797caef7a4143b65bb892f92959f2eaa5a811774ef7dc"} Mar 12 12:26:11.230786 master-0 kubenswrapper[13984]: I0312 12:26:11.218824 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8s844" event={"ID":"fa29115a-70fa-4a85-a071-66f11bd82829","Type":"ContainerStarted","Data":"035fbf4deb5a6b63c3deac5dc4dbed3ab8fa7eeb82775183a933983d8bd0987c"} Mar 12 12:26:11.230786 master-0 kubenswrapper[13984]: I0312 12:26:11.221860 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fk9sm" event={"ID":"a6635c84-7bcc-48ce-a388-170ea85a8b4f","Type":"ContainerStarted","Data":"060be9a22591ed6c089d208fbb2629e461607864fd6cc335729366b81b837ce4"} Mar 12 12:26:11.231871 master-0 kubenswrapper[13984]: I0312 12:26:11.231619 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mjrlw" event={"ID":"23004971-d185-418c-b0d9-5c0891456288","Type":"ContainerStarted","Data":"a441e67b221f9df6a67fb800f3ed69d0085f0ee63e383b278fa8f7446549fb62"} Mar 12 12:26:11.239709 master-0 kubenswrapper[13984]: I0312 12:26:11.239336 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kjp42" podStartSLOduration=3.197694254 podStartE2EDuration="39.239318864s" podCreationTimestamp="2026-03-12 12:25:32 +0000 UTC" firstStartedPulling="2026-03-12 12:25:34.66352277 +0000 UTC m=+66.861538312" lastFinishedPulling="2026-03-12 12:26:10.70514743 +0000 UTC m=+102.903162922" observedRunningTime="2026-03-12 12:26:11.237892555 +0000 UTC m=+103.435908057" watchObservedRunningTime="2026-03-12 12:26:11.239318864 +0000 UTC m=+103.437334356" Mar 12 
12:26:11.256767 master-0 kubenswrapper[13984]: I0312 12:26:11.256694 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8s844" podStartSLOduration=3.493675253 podStartE2EDuration="36.256676235s" podCreationTimestamp="2026-03-12 12:25:35 +0000 UTC" firstStartedPulling="2026-03-12 12:25:37.818759668 +0000 UTC m=+70.016775160" lastFinishedPulling="2026-03-12 12:26:10.58176065 +0000 UTC m=+102.779776142" observedRunningTime="2026-03-12 12:26:11.254086225 +0000 UTC m=+103.452101727" watchObservedRunningTime="2026-03-12 12:26:11.256676235 +0000 UTC m=+103.454691727" Mar 12 12:26:11.281602 master-0 kubenswrapper[13984]: I0312 12:26:11.277400 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-7bdwc" podStartSLOduration=339.277379967 podStartE2EDuration="5m39.277379967s" podCreationTimestamp="2026-03-12 12:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:26:11.274387876 +0000 UTC m=+103.472403358" watchObservedRunningTime="2026-03-12 12:26:11.277379967 +0000 UTC m=+103.475395459" Mar 12 12:26:11.298573 master-0 kubenswrapper[13984]: I0312 12:26:11.298474 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vb6qv" podStartSLOduration=5.298446839 podStartE2EDuration="5.298446839s" podCreationTimestamp="2026-03-12 12:26:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:26:11.295043937 +0000 UTC m=+103.493059439" watchObservedRunningTime="2026-03-12 12:26:11.298446839 +0000 UTC m=+103.496462331" Mar 12 12:26:11.398500 master-0 kubenswrapper[13984]: I0312 12:26:11.395835 13984 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" 
name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 12 12:26:11.417405 master-0 kubenswrapper[13984]: I0312 12:26:11.415670 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fk9sm" podStartSLOduration=2.620935244 podStartE2EDuration="38.415648291s" podCreationTimestamp="2026-03-12 12:25:33 +0000 UTC" firstStartedPulling="2026-03-12 12:25:34.666455549 +0000 UTC m=+66.864471081" lastFinishedPulling="2026-03-12 12:26:10.461168626 +0000 UTC m=+102.659184128" observedRunningTime="2026-03-12 12:26:11.379903781 +0000 UTC m=+103.577919273" watchObservedRunningTime="2026-03-12 12:26:11.415648291 +0000 UTC m=+103.613663793" Mar 12 12:26:13.297047 master-0 kubenswrapper[13984]: I0312 12:26:13.296969 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kjp42" Mar 12 12:26:13.297047 master-0 kubenswrapper[13984]: I0312 12:26:13.297045 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kjp42" Mar 12 12:26:13.357634 master-0 kubenswrapper[13984]: I0312 12:26:13.357586 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kjp42" Mar 12 12:26:13.380749 master-0 kubenswrapper[13984]: I0312 12:26:13.380664 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mjrlw" podStartSLOduration=5.521690541 podStartE2EDuration="43.380633563s" podCreationTimestamp="2026-03-12 12:25:30 +0000 UTC" firstStartedPulling="2026-03-12 12:25:32.64183637 +0000 UTC m=+64.839851872" lastFinishedPulling="2026-03-12 12:26:10.500779392 +0000 UTC m=+102.698794894" observedRunningTime="2026-03-12 12:26:11.416562726 +0000 UTC m=+103.614578228" watchObservedRunningTime="2026-03-12 12:26:13.380633563 +0000 UTC m=+105.578649085" Mar 12 12:26:13.553575 master-0 
kubenswrapper[13984]: I0312 12:26:13.552509 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fk9sm" Mar 12 12:26:13.553575 master-0 kubenswrapper[13984]: I0312 12:26:13.552571 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fk9sm" Mar 12 12:26:13.748741 master-0 kubenswrapper[13984]: I0312 12:26:13.748467 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-lkbqq"] Mar 12 12:26:13.750131 master-0 kubenswrapper[13984]: I0312 12:26:13.750094 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:13.752146 master-0 kubenswrapper[13984]: I0312 12:26:13.751964 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 12 12:26:13.752146 master-0 kubenswrapper[13984]: I0312 12:26:13.752020 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-cmdtw" Mar 12 12:26:13.752146 master-0 kubenswrapper[13984]: I0312 12:26:13.752068 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 12 12:26:13.903384 master-0 kubenswrapper[13984]: I0312 12:26:13.903293 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-certs\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:13.903546 master-0 kubenswrapper[13984]: I0312 12:26:13.903424 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-node-bootstrap-token\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:13.903546 master-0 kubenswrapper[13984]: I0312 12:26:13.903533 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv69r\" (UniqueName: \"kubernetes.io/projected/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-kube-api-access-mv69r\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.005123 master-0 kubenswrapper[13984]: I0312 12:26:14.005070 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mv69r\" (UniqueName: \"kubernetes.io/projected/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-kube-api-access-mv69r\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.005340 master-0 kubenswrapper[13984]: I0312 12:26:14.005230 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-certs\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.005340 master-0 kubenswrapper[13984]: I0312 12:26:14.005256 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-node-bootstrap-token\") pod \"machine-config-server-lkbqq\" (UID: 
\"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.010177 master-0 kubenswrapper[13984]: I0312 12:26:14.010139 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-certs\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.010418 master-0 kubenswrapper[13984]: I0312 12:26:14.010377 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-node-bootstrap-token\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.025756 master-0 kubenswrapper[13984]: I0312 12:26:14.025719 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv69r\" (UniqueName: \"kubernetes.io/projected/8dd52d53-f9f1-4575-b6cb-6eecd7687e73-kube-api-access-mv69r\") pod \"machine-config-server-lkbqq\" (UID: \"8dd52d53-f9f1-4575-b6cb-6eecd7687e73\") " pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.081719 master-0 kubenswrapper[13984]: I0312 12:26:14.081579 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-lkbqq" Mar 12 12:26:14.097386 master-0 kubenswrapper[13984]: W0312 12:26:14.097336 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dd52d53_f9f1_4575_b6cb_6eecd7687e73.slice/crio-65efc105c2540178e7983428d6cf4ea4c58f07e2b9088fe11e0f2c9b2adf7d93 WatchSource:0}: Error finding container 65efc105c2540178e7983428d6cf4ea4c58f07e2b9088fe11e0f2c9b2adf7d93: Status 404 returned error can't find the container with id 65efc105c2540178e7983428d6cf4ea4c58f07e2b9088fe11e0f2c9b2adf7d93 Mar 12 12:26:14.250760 master-0 kubenswrapper[13984]: I0312 12:26:14.250702 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" event={"ID":"f7e806c0-793c-42ba-91cb-12d64ba841d3","Type":"ContainerStarted","Data":"784fa4f20750b0f92073b80b8c3069e6c333fe90ac1e271b96dac0611c1eaa93"} Mar 12 12:26:14.251708 master-0 kubenswrapper[13984]: I0312 12:26:14.251668 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lkbqq" event={"ID":"8dd52d53-f9f1-4575-b6cb-6eecd7687e73","Type":"ContainerStarted","Data":"d197c0e5324d562c0d915c0399d89fe293a12b0ba0890e7355eb11449b539fa6"} Mar 12 12:26:14.251765 master-0 kubenswrapper[13984]: I0312 12:26:14.251710 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-lkbqq" event={"ID":"8dd52d53-f9f1-4575-b6cb-6eecd7687e73","Type":"ContainerStarted","Data":"65efc105c2540178e7983428d6cf4ea4c58f07e2b9088fe11e0f2c9b2adf7d93"} Mar 12 12:26:14.252724 master-0 kubenswrapper[13984]: I0312 12:26:14.252691 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" 
event={"ID":"1aba020f-dfec-455d-83d7-728145cc4114","Type":"ContainerStarted","Data":"09bddf619dc14c14ebba32bc60889953646e211e76d7ca33d9b01104adf59668"} Mar 12 12:26:14.279364 master-0 kubenswrapper[13984]: I0312 12:26:14.279280 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" podStartSLOduration=247.384756642 podStartE2EDuration="4m12.279256751s" podCreationTimestamp="2026-03-12 12:22:02 +0000 UTC" firstStartedPulling="2026-03-12 12:26:08.878392503 +0000 UTC m=+101.076408035" lastFinishedPulling="2026-03-12 12:26:13.772892632 +0000 UTC m=+105.970908144" observedRunningTime="2026-03-12 12:26:14.278525251 +0000 UTC m=+106.476540743" watchObservedRunningTime="2026-03-12 12:26:14.279256751 +0000 UTC m=+106.477272253" Mar 12 12:26:14.298788 master-0 kubenswrapper[13984]: I0312 12:26:14.298714 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-lkbqq" podStartSLOduration=1.2986920180000001 podStartE2EDuration="1.298692018s" podCreationTimestamp="2026-03-12 12:26:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:26:14.298119533 +0000 UTC m=+106.496135035" watchObservedRunningTime="2026-03-12 12:26:14.298692018 +0000 UTC m=+106.496707530" Mar 12 12:26:14.318037 master-0 kubenswrapper[13984]: I0312 12:26:14.317372 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" podStartSLOduration=74.570204607 podStartE2EDuration="1m18.317356325s" podCreationTimestamp="2026-03-12 12:24:56 +0000 UTC" firstStartedPulling="2026-03-12 12:26:09.972569901 +0000 UTC m=+102.170585393" lastFinishedPulling="2026-03-12 12:26:13.719721619 +0000 UTC m=+105.917737111" observedRunningTime="2026-03-12 12:26:14.316294406 +0000 UTC 
m=+106.514309908" watchObservedRunningTime="2026-03-12 12:26:14.317356325 +0000 UTC m=+106.515371817" Mar 12 12:26:14.594875 master-0 kubenswrapper[13984]: I0312 12:26:14.594781 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fk9sm" podUID="a6635c84-7bcc-48ce-a388-170ea85a8b4f" containerName="registry-server" probeResult="failure" output=< Mar 12 12:26:14.594875 master-0 kubenswrapper[13984]: timeout: failed to connect service ":50051" within 1s Mar 12 12:26:14.594875 master-0 kubenswrapper[13984]: > Mar 12 12:26:15.016207 master-0 kubenswrapper[13984]: I0312 12:26:15.016134 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:15.019071 master-0 kubenswrapper[13984]: I0312 12:26:15.019031 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:15.257272 master-0 kubenswrapper[13984]: I0312 12:26:15.257208 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" Mar 12 12:26:15.257958 master-0 kubenswrapper[13984]: I0312 12:26:15.257930 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:15.260715 master-0 kubenswrapper[13984]: I0312 12:26:15.260687 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8cd6fdd-pczct" Mar 12 12:26:15.262639 master-0 kubenswrapper[13984]: I0312 12:26:15.262593 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-c6pf6" Mar 12 12:26:16.308745 master-0 kubenswrapper[13984]: I0312 12:26:16.308678 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-8s844" Mar 12 12:26:16.309499 master-0 kubenswrapper[13984]: I0312 12:26:16.308764 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8s844" Mar 12 12:26:16.347028 master-0 kubenswrapper[13984]: I0312 12:26:16.346964 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8s844" Mar 12 12:26:16.376154 master-0 kubenswrapper[13984]: I0312 12:26:16.376095 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g"] Mar 12 12:26:16.377304 master-0 kubenswrapper[13984]: I0312 12:26:16.377271 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.380080 master-0 kubenswrapper[13984]: I0312 12:26:16.379839 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 12 12:26:16.380080 master-0 kubenswrapper[13984]: I0312 12:26:16.379839 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 12 12:26:16.380080 master-0 kubenswrapper[13984]: I0312 12:26:16.379978 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-rzflr" Mar 12 12:26:16.380080 master-0 kubenswrapper[13984]: I0312 12:26:16.379855 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 12 12:26:16.402056 master-0 kubenswrapper[13984]: I0312 12:26:16.402009 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g"] Mar 12 12:26:16.554378 master-0 kubenswrapper[13984]: I0312 12:26:16.554300 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpbpw\" (UniqueName: \"kubernetes.io/projected/87c8661b-4fad-4cfc-960e-5ab500121810-kube-api-access-zpbpw\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.554617 master-0 kubenswrapper[13984]: I0312 12:26:16.554405 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87c8661b-4fad-4cfc-960e-5ab500121810-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.554617 master-0 kubenswrapper[13984]: I0312 12:26:16.554507 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87c8661b-4fad-4cfc-960e-5ab500121810-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.554617 master-0 kubenswrapper[13984]: I0312 12:26:16.554581 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/87c8661b-4fad-4cfc-960e-5ab500121810-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.656356 master-0 kubenswrapper[13984]: I0312 12:26:16.656232 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpbpw\" (UniqueName: 
\"kubernetes.io/projected/87c8661b-4fad-4cfc-960e-5ab500121810-kube-api-access-zpbpw\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.656356 master-0 kubenswrapper[13984]: I0312 12:26:16.656300 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87c8661b-4fad-4cfc-960e-5ab500121810-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.656587 master-0 kubenswrapper[13984]: I0312 12:26:16.656360 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87c8661b-4fad-4cfc-960e-5ab500121810-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.656587 master-0 kubenswrapper[13984]: I0312 12:26:16.656420 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/87c8661b-4fad-4cfc-960e-5ab500121810-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.658782 master-0 kubenswrapper[13984]: I0312 12:26:16.658730 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/87c8661b-4fad-4cfc-960e-5ab500121810-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " 
pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.660034 master-0 kubenswrapper[13984]: I0312 12:26:16.659997 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/87c8661b-4fad-4cfc-960e-5ab500121810-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.660809 master-0 kubenswrapper[13984]: I0312 12:26:16.660782 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/87c8661b-4fad-4cfc-960e-5ab500121810-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.688430 master-0 kubenswrapper[13984]: I0312 12:26:16.688345 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpbpw\" (UniqueName: \"kubernetes.io/projected/87c8661b-4fad-4cfc-960e-5ab500121810-kube-api-access-zpbpw\") pod \"prometheus-operator-5ff8674d55-mgk6g\" (UID: \"87c8661b-4fad-4cfc-960e-5ab500121810\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:16.694285 master-0 kubenswrapper[13984]: I0312 12:26:16.694241 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" Mar 12 12:26:17.120058 master-0 kubenswrapper[13984]: I0312 12:26:17.120017 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g"] Mar 12 12:26:17.122981 master-0 kubenswrapper[13984]: W0312 12:26:17.122923 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87c8661b_4fad_4cfc_960e_5ab500121810.slice/crio-b9ca8a0afb841ba41e244cc1ba815e690e89b4535449d99386ef46e065ea8c58 WatchSource:0}: Error finding container b9ca8a0afb841ba41e244cc1ba815e690e89b4535449d99386ef46e065ea8c58: Status 404 returned error can't find the container with id b9ca8a0afb841ba41e244cc1ba815e690e89b4535449d99386ef46e065ea8c58 Mar 12 12:26:17.268234 master-0 kubenswrapper[13984]: I0312 12:26:17.268169 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" event={"ID":"87c8661b-4fad-4cfc-960e-5ab500121810","Type":"ContainerStarted","Data":"b9ca8a0afb841ba41e244cc1ba815e690e89b4535449d99386ef46e065ea8c58"} Mar 12 12:26:17.327986 master-0 kubenswrapper[13984]: I0312 12:26:17.327933 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8s844" Mar 12 12:26:19.280066 master-0 kubenswrapper[13984]: I0312 12:26:19.280005 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" event={"ID":"87c8661b-4fad-4cfc-960e-5ab500121810","Type":"ContainerStarted","Data":"506388fec6f2d5d0e5572d730da8d3b548ea5d142f5f309fa8e00d05e79ff8ac"} Mar 12 12:26:19.280066 master-0 kubenswrapper[13984]: I0312 12:26:19.280046 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" 
event={"ID":"87c8661b-4fad-4cfc-960e-5ab500121810","Type":"ContainerStarted","Data":"9a7d1e7a27c9fd1a705a18f44427b3e74bffd6408cd3f949ddd2c3f222d73847"} Mar 12 12:26:19.305824 master-0 kubenswrapper[13984]: I0312 12:26:19.305748 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5ff8674d55-mgk6g" podStartSLOduration=1.518993723 podStartE2EDuration="3.305726514s" podCreationTimestamp="2026-03-12 12:26:16 +0000 UTC" firstStartedPulling="2026-03-12 12:26:17.125677104 +0000 UTC m=+109.323692606" lastFinishedPulling="2026-03-12 12:26:18.912409905 +0000 UTC m=+111.110425397" observedRunningTime="2026-03-12 12:26:19.305679053 +0000 UTC m=+111.503694565" watchObservedRunningTime="2026-03-12 12:26:19.305726514 +0000 UTC m=+111.503742026" Mar 12 12:26:21.120415 master-0 kubenswrapper[13984]: I0312 12:26:21.120237 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:26:21.121450 master-0 kubenswrapper[13984]: I0312 12:26:21.120606 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:26:21.190741 master-0 kubenswrapper[13984]: I0312 12:26:21.190650 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:26:21.368733 master-0 kubenswrapper[13984]: I0312 12:26:21.368635 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mjrlw" Mar 12 12:26:21.752324 master-0 kubenswrapper[13984]: I0312 12:26:21.752273 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2"] Mar 12 12:26:21.753585 master-0 kubenswrapper[13984]: I0312 12:26:21.753543 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.757822 master-0 kubenswrapper[13984]: I0312 12:26:21.757783 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-brf42" Mar 12 12:26:21.757987 master-0 kubenswrapper[13984]: I0312 12:26:21.757970 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 12 12:26:21.758235 master-0 kubenswrapper[13984]: I0312 12:26:21.758197 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 12 12:26:21.758640 master-0 kubenswrapper[13984]: I0312 12:26:21.758610 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f89782fd-6737-4e13-898d-0683549c6673-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.758698 master-0 kubenswrapper[13984]: I0312 12:26:21.758676 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.758751 master-0 kubenswrapper[13984]: I0312 12:26:21.758733 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.758856 master-0 kubenswrapper[13984]: I0312 12:26:21.758827 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc7h5\" (UniqueName: \"kubernetes.io/projected/f89782fd-6737-4e13-898d-0683549c6673-kube-api-access-vc7h5\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.764200 master-0 kubenswrapper[13984]: I0312 12:26:21.764163 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-srg57"] Mar 12 12:26:21.765609 master-0 kubenswrapper[13984]: I0312 12:26:21.765581 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.767114 master-0 kubenswrapper[13984]: I0312 12:26:21.767092 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 12 12:26:21.767326 master-0 kubenswrapper[13984]: I0312 12:26:21.767308 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 12 12:26:21.767905 master-0 kubenswrapper[13984]: I0312 12:26:21.767743 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-6r5d5" Mar 12 12:26:21.777546 master-0 kubenswrapper[13984]: I0312 12:26:21.777508 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2"] Mar 12 12:26:21.790624 master-0 kubenswrapper[13984]: I0312 12:26:21.790535 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5"] Mar 12 12:26:21.791823 master-0 kubenswrapper[13984]: I0312 12:26:21.791797 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.793979 master-0 kubenswrapper[13984]: I0312 12:26:21.793911 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 12 12:26:21.794063 master-0 kubenswrapper[13984]: I0312 12:26:21.794025 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-fszq7" Mar 12 12:26:21.794103 master-0 kubenswrapper[13984]: I0312 12:26:21.794070 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 12 12:26:21.794211 master-0 kubenswrapper[13984]: I0312 12:26:21.794186 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 12 12:26:21.822663 master-0 kubenswrapper[13984]: I0312 12:26:21.821671 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5"] Mar 12 12:26:21.859795 master-0 kubenswrapper[13984]: I0312 12:26:21.859728 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f89782fd-6737-4e13-898d-0683549c6673-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.859980 master-0 kubenswrapper[13984]: I0312 12:26:21.859810 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hd927\" (UniqueName: \"kubernetes.io/projected/9e18a6c1-574b-4191-8672-ca05718474d6-kube-api-access-hd927\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 
12:26:21.859980 master-0 kubenswrapper[13984]: I0312 12:26:21.859847 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-tls\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.859980 master-0 kubenswrapper[13984]: I0312 12:26:21.859875 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.859980 master-0 kubenswrapper[13984]: I0312 12:26:21.859898 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-wtmp\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.859980 master-0 kubenswrapper[13984]: I0312 12:26:21.859921 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-root\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.859980 master-0 kubenswrapper[13984]: I0312 12:26:21.859952 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: 
\"kubernetes.io/empty-dir/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-textfile\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.859980 master-0 kubenswrapper[13984]: I0312 12:26:21.859973 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.860172 master-0 kubenswrapper[13984]: I0312 12:26:21.859995 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.860172 master-0 kubenswrapper[13984]: I0312 12:26:21.860029 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.860172 master-0 kubenswrapper[13984]: I0312 12:26:21.860062 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zqx8\" (UniqueName: \"kubernetes.io/projected/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-api-access-7zqx8\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: 
\"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.860172 master-0 kubenswrapper[13984]: I0312 12:26:21.860093 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/580c29d3-9f21-4949-8bfb-a7ec31f4da85-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.860172 master-0 kubenswrapper[13984]: I0312 12:26:21.860117 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/580c29d3-9f21-4949-8bfb-a7ec31f4da85-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.860172 master-0 kubenswrapper[13984]: I0312 12:26:21.860144 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.860172 master-0 kubenswrapper[13984]: I0312 12:26:21.860169 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e18a6c1-574b-4191-8672-ca05718474d6-metrics-client-ca\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.860361 master-0 
kubenswrapper[13984]: I0312 12:26:21.860195 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc7h5\" (UniqueName: \"kubernetes.io/projected/f89782fd-6737-4e13-898d-0683549c6673-kube-api-access-vc7h5\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.860361 master-0 kubenswrapper[13984]: I0312 12:26:21.860222 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-sys\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.860361 master-0 kubenswrapper[13984]: I0312 12:26:21.860281 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.861277 master-0 kubenswrapper[13984]: I0312 12:26:21.861236 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f89782fd-6737-4e13-898d-0683549c6673-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.862837 master-0 kubenswrapper[13984]: E0312 12:26:21.862787 13984 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Mar 12 12:26:21.862903 master-0 kubenswrapper[13984]: E0312 12:26:21.862887 
13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-tls podName:f89782fd-6737-4e13-898d-0683549c6673 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:22.362863873 +0000 UTC m=+114.560879365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-tls") pod "openshift-state-metrics-74cc79fd76-tgch2" (UID: "f89782fd-6737-4e13-898d-0683549c6673") : secret "openshift-state-metrics-tls" not found Mar 12 12:26:21.876869 master-0 kubenswrapper[13984]: I0312 12:26:21.869983 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.890656 master-0 kubenswrapper[13984]: I0312 12:26:21.886804 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc7h5\" (UniqueName: \"kubernetes.io/projected/f89782fd-6737-4e13-898d-0683549c6673-kube-api-access-vc7h5\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:21.962118 master-0 kubenswrapper[13984]: I0312 12:26:21.962051 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.962118 master-0 kubenswrapper[13984]: I0312 12:26:21.962109 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hd927\" (UniqueName: \"kubernetes.io/projected/9e18a6c1-574b-4191-8672-ca05718474d6-kube-api-access-hd927\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962381 master-0 kubenswrapper[13984]: E0312 12:26:21.962195 13984 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 12 12:26:21.962381 master-0 kubenswrapper[13984]: E0312 12:26:21.962245 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-tls podName:580c29d3-9f21-4949-8bfb-a7ec31f4da85 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:22.462229961 +0000 UTC m=+114.660245453 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-krrg5" (UID: "580c29d3-9f21-4949-8bfb-a7ec31f4da85") : secret "kube-state-metrics-tls" not found Mar 12 12:26:21.962381 master-0 kubenswrapper[13984]: I0312 12:26:21.962351 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-tls\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962542 master-0 kubenswrapper[13984]: I0312 12:26:21.962395 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-wtmp\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962542 master-0 kubenswrapper[13984]: E0312 12:26:21.962487 13984 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Mar 12 12:26:21.962542 master-0 kubenswrapper[13984]: E0312 12:26:21.962524 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-tls podName:9e18a6c1-574b-4191-8672-ca05718474d6 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:22.462513898 +0000 UTC m=+114.660529390 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-tls") pod "node-exporter-srg57" (UID: "9e18a6c1-574b-4191-8672-ca05718474d6") : secret "node-exporter-tls" not found Mar 12 12:26:21.962542 master-0 kubenswrapper[13984]: I0312 12:26:21.962544 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-root\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962714 master-0 kubenswrapper[13984]: I0312 12:26:21.962578 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-textfile\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962714 master-0 kubenswrapper[13984]: I0312 12:26:21.962576 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-wtmp\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962714 master-0 kubenswrapper[13984]: I0312 12:26:21.962607 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-root\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962714 master-0 kubenswrapper[13984]: I0312 12:26:21.962639 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.962714 master-0 kubenswrapper[13984]: I0312 12:26:21.962666 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.963053 master-0 kubenswrapper[13984]: I0312 12:26:21.962970 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-textfile\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.963705 master-0 kubenswrapper[13984]: I0312 12:26:21.963658 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zqx8\" (UniqueName: \"kubernetes.io/projected/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-api-access-7zqx8\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.963782 master-0 kubenswrapper[13984]: I0312 12:26:21.963745 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/580c29d3-9f21-4949-8bfb-a7ec31f4da85-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.963832 master-0 kubenswrapper[13984]: I0312 12:26:21.963781 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/580c29d3-9f21-4949-8bfb-a7ec31f4da85-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.963832 master-0 kubenswrapper[13984]: I0312 12:26:21.963819 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.963919 master-0 kubenswrapper[13984]: I0312 12:26:21.963852 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e18a6c1-574b-4191-8672-ca05718474d6-metrics-client-ca\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.963919 master-0 kubenswrapper[13984]: I0312 12:26:21.963899 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-sys\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.964087 master-0 kubenswrapper[13984]: I0312 12:26:21.964062 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/9e18a6c1-574b-4191-8672-ca05718474d6-sys\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.964711 master-0 kubenswrapper[13984]: I0312 12:26:21.964686 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/580c29d3-9f21-4949-8bfb-a7ec31f4da85-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.964781 master-0 kubenswrapper[13984]: I0312 12:26:21.964753 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/580c29d3-9f21-4949-8bfb-a7ec31f4da85-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.973671 master-0 kubenswrapper[13984]: I0312 12:26:21.964923 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9e18a6c1-574b-4191-8672-ca05718474d6-metrics-client-ca\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.973671 master-0 kubenswrapper[13984]: I0312 12:26:21.965374 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.973671 master-0 kubenswrapper[13984]: 
I0312 12:26:21.965554 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:21.987117 master-0 kubenswrapper[13984]: I0312 12:26:21.987071 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.989397 master-0 kubenswrapper[13984]: I0312 12:26:21.989352 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zqx8\" (UniqueName: \"kubernetes.io/projected/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-api-access-7zqx8\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:21.990927 master-0 kubenswrapper[13984]: I0312 12:26:21.990890 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hd927\" (UniqueName: \"kubernetes.io/projected/9e18a6c1-574b-4191-8672-ca05718474d6-kube-api-access-hd927\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:22.366682 master-0 kubenswrapper[13984]: I0312 12:26:22.366632 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-tls\") pod 
\"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:22.370505 master-0 kubenswrapper[13984]: I0312 12:26:22.370440 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f89782fd-6737-4e13-898d-0683549c6673-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-tgch2\" (UID: \"f89782fd-6737-4e13-898d-0683549c6673\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:22.372854 master-0 kubenswrapper[13984]: I0312 12:26:22.372794 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" Mar 12 12:26:22.468911 master-0 kubenswrapper[13984]: I0312 12:26:22.468841 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:22.469118 master-0 kubenswrapper[13984]: I0312 12:26:22.468942 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-tls\") pod \"node-exporter-srg57\" (UID: \"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:22.472444 master-0 kubenswrapper[13984]: I0312 12:26:22.472355 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9e18a6c1-574b-4191-8672-ca05718474d6-node-exporter-tls\") pod \"node-exporter-srg57\" (UID: 
\"9e18a6c1-574b-4191-8672-ca05718474d6\") " pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:22.473695 master-0 kubenswrapper[13984]: I0312 12:26:22.473631 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/580c29d3-9f21-4949-8bfb-a7ec31f4da85-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-krrg5\" (UID: \"580c29d3-9f21-4949-8bfb-a7ec31f4da85\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:22.687407 master-0 kubenswrapper[13984]: I0312 12:26:22.687292 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-srg57" Mar 12 12:26:22.742545 master-0 kubenswrapper[13984]: I0312 12:26:22.738661 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" Mar 12 12:26:22.822546 master-0 kubenswrapper[13984]: I0312 12:26:22.817032 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:26:22.822546 master-0 kubenswrapper[13984]: I0312 12:26:22.819494 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.828797 master-0 kubenswrapper[13984]: I0312 12:26:22.828154 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 12 12:26:22.828797 master-0 kubenswrapper[13984]: I0312 12:26:22.828239 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 12 12:26:22.828797 master-0 kubenswrapper[13984]: I0312 12:26:22.828408 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 12 12:26:22.828797 master-0 kubenswrapper[13984]: I0312 12:26:22.828548 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 12 12:26:22.828797 master-0 kubenswrapper[13984]: I0312 12:26:22.828655 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 12 12:26:22.829083 master-0 kubenswrapper[13984]: I0312 12:26:22.828867 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 12 12:26:22.829129 master-0 kubenswrapper[13984]: I0312 12:26:22.829085 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 12 12:26:22.829284 master-0 kubenswrapper[13984]: I0312 12:26:22.829267 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 12 12:26:22.832532 master-0 kubenswrapper[13984]: I0312 12:26:22.832036 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-24x8c" Mar 12 12:26:22.862750 master-0 kubenswrapper[13984]: I0312 12:26:22.846512 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:26:22.870694 master-0 kubenswrapper[13984]: I0312 12:26:22.870633 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2"] Mar 12 12:26:22.978816 master-0 kubenswrapper[13984]: I0312 12:26:22.978763 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.978999 master-0 kubenswrapper[13984]: I0312 12:26:22.978843 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-web-config\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.978999 master-0 kubenswrapper[13984]: I0312 12:26:22.978942 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-out\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.978999 master-0 kubenswrapper[13984]: I0312 12:26:22.978984 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 
12:26:22.979089 master-0 kubenswrapper[13984]: I0312 12:26:22.979020 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.979089 master-0 kubenswrapper[13984]: I0312 12:26:22.979062 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-volume\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.979145 master-0 kubenswrapper[13984]: I0312 12:26:22.979105 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.979178 master-0 kubenswrapper[13984]: I0312 12:26:22.979149 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.979208 master-0 kubenswrapper[13984]: I0312 12:26:22.979176 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-tls-assets\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.979241 master-0 kubenswrapper[13984]: I0312 12:26:22.979195 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.979241 master-0 kubenswrapper[13984]: I0312 12:26:22.979232 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:22.979299 master-0 kubenswrapper[13984]: I0312 12:26:22.979250 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnnth\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-kube-api-access-tnnth\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.080927 master-0 kubenswrapper[13984]: I0312 12:26:23.080872 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-out\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.080927 master-0 kubenswrapper[13984]: I0312 12:26:23.080917 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.080944 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.080969 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-volume\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081004 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081029 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081051 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-tls-assets\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081066 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081079 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081094 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnnth\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-kube-api-access-tnnth\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081116 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.081145 master-0 kubenswrapper[13984]: I0312 12:26:23.081137 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-web-config\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.083358 master-0 kubenswrapper[13984]: I0312 12:26:23.083327 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.083652 master-0 kubenswrapper[13984]: I0312 12:26:23.083621 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.084024 master-0 kubenswrapper[13984]: E0312 12:26:23.083885 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:23.583873674 +0000 UTC m=+115.781889166 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:23.085540 master-0 kubenswrapper[13984]: I0312 12:26:23.085436 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-tls-assets\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.086605 master-0 kubenswrapper[13984]: I0312 12:26:23.086582 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-web-config\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.087372 master-0 kubenswrapper[13984]: I0312 12:26:23.087340 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.088764 master-0 kubenswrapper[13984]: I0312 12:26:23.088662 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.088858 master-0 kubenswrapper[13984]: I0312 12:26:23.088768 
13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-out\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.089934 master-0 kubenswrapper[13984]: I0312 12:26:23.089908 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.095546 master-0 kubenswrapper[13984]: I0312 12:26:23.090764 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-volume\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.095546 master-0 kubenswrapper[13984]: I0312 12:26:23.091931 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.109320 master-0 kubenswrapper[13984]: I0312 12:26:23.109259 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnnth\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-kube-api-access-tnnth\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.242247 master-0 
kubenswrapper[13984]: I0312 12:26:23.242193 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5"] Mar 12 12:26:23.247082 master-0 kubenswrapper[13984]: W0312 12:26:23.247027 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod580c29d3_9f21_4949_8bfb_a7ec31f4da85.slice/crio-9f821198998092acab91624e5224211f95b4633f58ec29a182e4903adf9b1883 WatchSource:0}: Error finding container 9f821198998092acab91624e5224211f95b4633f58ec29a182e4903adf9b1883: Status 404 returned error can't find the container with id 9f821198998092acab91624e5224211f95b4633f58ec29a182e4903adf9b1883 Mar 12 12:26:23.311500 master-0 kubenswrapper[13984]: I0312 12:26:23.308345 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" event={"ID":"580c29d3-9f21-4949-8bfb-a7ec31f4da85","Type":"ContainerStarted","Data":"9f821198998092acab91624e5224211f95b4633f58ec29a182e4903adf9b1883"} Mar 12 12:26:23.318319 master-0 kubenswrapper[13984]: I0312 12:26:23.318270 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" event={"ID":"f89782fd-6737-4e13-898d-0683549c6673","Type":"ContainerStarted","Data":"1adcd652fff15cc1ba3ab9fd4bcc11b9a4f4737b0f0db9e215cab8d7c711c68e"} Mar 12 12:26:23.318319 master-0 kubenswrapper[13984]: I0312 12:26:23.318316 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" event={"ID":"f89782fd-6737-4e13-898d-0683549c6673","Type":"ContainerStarted","Data":"7eccb4ae1f11cf80104f3c9b332589230c61a6eb36eaf7cea4820090072e7e0f"} Mar 12 12:26:23.318319 master-0 kubenswrapper[13984]: I0312 12:26:23.318328 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" 
event={"ID":"f89782fd-6737-4e13-898d-0683549c6673","Type":"ContainerStarted","Data":"86b3084c830bda6bfc65e9aa9971318581abe396e30a28bb89ac5b01d737c91d"} Mar 12 12:26:23.319955 master-0 kubenswrapper[13984]: I0312 12:26:23.319927 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-srg57" event={"ID":"9e18a6c1-574b-4191-8672-ca05718474d6","Type":"ContainerStarted","Data":"ec9ba987f5f685b706ed25af163bf8bd41891375855f85d91d0c01ba1dfba864"} Mar 12 12:26:23.338346 master-0 kubenswrapper[13984]: I0312 12:26:23.337842 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kjp42" Mar 12 12:26:23.588445 master-0 kubenswrapper[13984]: I0312 12:26:23.588330 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:23.588896 master-0 kubenswrapper[13984]: E0312 12:26:23.588670 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:24.58864356 +0000 UTC m=+116.786659092 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:23.611415 master-0 kubenswrapper[13984]: I0312 12:26:23.611354 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fk9sm" Mar 12 12:26:23.650822 master-0 kubenswrapper[13984]: I0312 12:26:23.650771 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fk9sm" Mar 12 12:26:23.948598 master-0 kubenswrapper[13984]: I0312 12:26:23.948550 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-c5678cd69-z9dpl"] Mar 12 12:26:23.951492 master-0 kubenswrapper[13984]: I0312 12:26:23.951440 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:23.956937 master-0 kubenswrapper[13984]: I0312 12:26:23.955381 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 12 12:26:23.956937 master-0 kubenswrapper[13984]: I0312 12:26:23.955631 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-ljrjz" Mar 12 12:26:23.956937 master-0 kubenswrapper[13984]: I0312 12:26:23.955788 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 12 12:26:23.956937 master-0 kubenswrapper[13984]: I0312 12:26:23.956204 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 12 12:26:23.956937 master-0 kubenswrapper[13984]: I0312 12:26:23.956871 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 12 12:26:23.957311 master-0 kubenswrapper[13984]: I0312 12:26:23.957015 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 12 12:26:23.957621 master-0 kubenswrapper[13984]: I0312 12:26:23.957584 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-37hhdc2hpri03" Mar 12 12:26:23.966708 master-0 kubenswrapper[13984]: I0312 12:26:23.965451 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c5678cd69-z9dpl"] Mar 12 12:26:24.104411 master-0 kubenswrapper[13984]: I0312 12:26:24.104212 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: 
\"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.104411 master-0 kubenswrapper[13984]: I0312 12:26:24.104275 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-tls\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.104411 master-0 kubenswrapper[13984]: I0312 12:26:24.104301 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-grpc-tls\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.104411 master-0 kubenswrapper[13984]: I0312 12:26:24.104374 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.104411 master-0 kubenswrapper[13984]: I0312 12:26:24.104419 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/015b85c8-1d75-4de2-94b7-1b43b43219a0-metrics-client-ca\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: 
\"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.104846 master-0 kubenswrapper[13984]: I0312 12:26:24.104443 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcw28\" (UniqueName: \"kubernetes.io/projected/015b85c8-1d75-4de2-94b7-1b43b43219a0-kube-api-access-dcw28\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.104846 master-0 kubenswrapper[13984]: I0312 12:26:24.104468 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.104846 master-0 kubenswrapper[13984]: I0312 12:26:24.104555 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.206565 master-0 kubenswrapper[13984]: I0312 12:26:24.205940 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" 
Mar 12 12:26:24.206565 master-0 kubenswrapper[13984]: I0312 12:26:24.206034 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/015b85c8-1d75-4de2-94b7-1b43b43219a0-metrics-client-ca\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.206950 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/015b85c8-1d75-4de2-94b7-1b43b43219a0-metrics-client-ca\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.207007 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcw28\" (UniqueName: \"kubernetes.io/projected/015b85c8-1d75-4de2-94b7-1b43b43219a0-kube-api-access-dcw28\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.207029 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.207088 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.207126 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.207146 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-tls\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.207161 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-grpc-tls\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.210497 master-0 kubenswrapper[13984]: I0312 12:26:24.209057 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " 
pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.214493 master-0 kubenswrapper[13984]: I0312 12:26:24.212041 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.218176 master-0 kubenswrapper[13984]: I0312 12:26:24.218132 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.218900 master-0 kubenswrapper[13984]: I0312 12:26:24.218876 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-tls\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.219393 master-0 kubenswrapper[13984]: I0312 12:26:24.219354 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.219445 master-0 kubenswrapper[13984]: I0312 12:26:24.219389 13984 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/015b85c8-1d75-4de2-94b7-1b43b43219a0-secret-grpc-tls\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.237352 master-0 kubenswrapper[13984]: I0312 12:26:24.237302 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcw28\" (UniqueName: \"kubernetes.io/projected/015b85c8-1d75-4de2-94b7-1b43b43219a0-kube-api-access-dcw28\") pod \"thanos-querier-c5678cd69-z9dpl\" (UID: \"015b85c8-1d75-4de2-94b7-1b43b43219a0\") " pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.288545 master-0 kubenswrapper[13984]: I0312 12:26:24.288347 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" Mar 12 12:26:24.612250 master-0 kubenswrapper[13984]: I0312 12:26:24.612135 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:24.612723 master-0 kubenswrapper[13984]: E0312 12:26:24.612376 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:26.612352564 +0000 UTC m=+118.810368056 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:24.794171 master-0 kubenswrapper[13984]: I0312 12:26:24.793403 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-c5678cd69-z9dpl"] Mar 12 12:26:25.092064 master-0 kubenswrapper[13984]: W0312 12:26:25.092006 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod015b85c8_1d75_4de2_94b7_1b43b43219a0.slice/crio-d157031a33c1499fbeea246d7c6eb8849e487dd6e9fef63c4cb6563c573a657d WatchSource:0}: Error finding container d157031a33c1499fbeea246d7c6eb8849e487dd6e9fef63c4cb6563c573a657d: Status 404 returned error can't find the container with id d157031a33c1499fbeea246d7c6eb8849e487dd6e9fef63c4cb6563c573a657d Mar 12 12:26:25.331542 master-0 kubenswrapper[13984]: I0312 12:26:25.331485 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" event={"ID":"015b85c8-1d75-4de2-94b7-1b43b43219a0","Type":"ContainerStarted","Data":"d157031a33c1499fbeea246d7c6eb8849e487dd6e9fef63c4cb6563c573a657d"} Mar 12 12:26:26.345764 master-0 kubenswrapper[13984]: I0312 12:26:26.345384 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" event={"ID":"580c29d3-9f21-4949-8bfb-a7ec31f4da85","Type":"ContainerStarted","Data":"8d026e72802190f7f3d527f2f6eb6b59e27d348cf5aa58da5692b14a7265f4d3"} Mar 12 12:26:26.345764 master-0 kubenswrapper[13984]: I0312 12:26:26.345466 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" 
event={"ID":"580c29d3-9f21-4949-8bfb-a7ec31f4da85","Type":"ContainerStarted","Data":"f36f13bef61002cbb3fdb19ac4f734e26ab535d4b050319e84bcf742eaa1fe05"} Mar 12 12:26:26.345764 master-0 kubenswrapper[13984]: I0312 12:26:26.345719 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" event={"ID":"580c29d3-9f21-4949-8bfb-a7ec31f4da85","Type":"ContainerStarted","Data":"c8abad1b67515320fc1f54d9638251fb7e9d9ff8458d3491f811166ea718fbc3"} Mar 12 12:26:26.352560 master-0 kubenswrapper[13984]: I0312 12:26:26.352156 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" event={"ID":"f89782fd-6737-4e13-898d-0683549c6673","Type":"ContainerStarted","Data":"243efb7f5a08c3fb715fd1b06b690622392c291e3c5cf2bead27b7323638783d"} Mar 12 12:26:26.371243 master-0 kubenswrapper[13984]: I0312 12:26:26.370581 13984 generic.go:334] "Generic (PLEG): container finished" podID="9e18a6c1-574b-4191-8672-ca05718474d6" containerID="97dd0e7f5e14feda64dadf6a76ba2e4ba0b9ee14a751aa1566dbde1880b771b1" exitCode=0 Mar 12 12:26:26.371243 master-0 kubenswrapper[13984]: I0312 12:26:26.370702 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-srg57" event={"ID":"9e18a6c1-574b-4191-8672-ca05718474d6","Type":"ContainerDied","Data":"97dd0e7f5e14feda64dadf6a76ba2e4ba0b9ee14a751aa1566dbde1880b771b1"} Mar 12 12:26:26.390622 master-0 kubenswrapper[13984]: I0312 12:26:26.390007 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-krrg5" podStartSLOduration=3.198602671 podStartE2EDuration="5.389969838s" podCreationTimestamp="2026-03-12 12:26:21 +0000 UTC" firstStartedPulling="2026-03-12 12:26:23.256341227 +0000 UTC m=+115.454356719" lastFinishedPulling="2026-03-12 12:26:25.447708394 +0000 UTC m=+117.645723886" observedRunningTime="2026-03-12 12:26:26.384240312 +0000 
UTC m=+118.582255884" watchObservedRunningTime="2026-03-12 12:26:26.389969838 +0000 UTC m=+118.587985380" Mar 12 12:26:26.453853 master-0 kubenswrapper[13984]: I0312 12:26:26.453743 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-tgch2" podStartSLOduration=3.216205399 podStartE2EDuration="5.453717309s" podCreationTimestamp="2026-03-12 12:26:21 +0000 UTC" firstStartedPulling="2026-03-12 12:26:23.200195313 +0000 UTC m=+115.398210805" lastFinishedPulling="2026-03-12 12:26:25.437707223 +0000 UTC m=+117.635722715" observedRunningTime="2026-03-12 12:26:26.452856305 +0000 UTC m=+118.650871867" watchObservedRunningTime="2026-03-12 12:26:26.453717309 +0000 UTC m=+118.651732841" Mar 12 12:26:26.641064 master-0 kubenswrapper[13984]: I0312 12:26:26.641016 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:26.641209 master-0 kubenswrapper[13984]: E0312 12:26:26.641183 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:30.641161698 +0000 UTC m=+122.839177190 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:27.058272 master-0 kubenswrapper[13984]: I0312 12:26:27.058208 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-55dcbc49f6-v92w7"] Mar 12 12:26:27.059277 master-0 kubenswrapper[13984]: I0312 12:26:27.059244 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.065354 master-0 kubenswrapper[13984]: I0312 12:26:27.065304 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-cr5kf" Mar 12 12:26:27.065567 master-0 kubenswrapper[13984]: I0312 12:26:27.065472 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 12 12:26:27.065631 master-0 kubenswrapper[13984]: I0312 12:26:27.065590 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-8vnkv64t8t82h" Mar 12 12:26:27.066291 master-0 kubenswrapper[13984]: I0312 12:26:27.065744 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 12 12:26:27.072357 master-0 kubenswrapper[13984]: I0312 12:26:27.066819 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 12 12:26:27.072357 master-0 kubenswrapper[13984]: I0312 12:26:27.066971 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 12 12:26:27.072357 master-0 kubenswrapper[13984]: I0312 12:26:27.070947 13984 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55dcbc49f6-v92w7"] Mar 12 12:26:27.148496 master-0 kubenswrapper[13984]: I0312 12:26:27.148396 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-client-ca-bundle\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.148496 master-0 kubenswrapper[13984]: I0312 12:26:27.148471 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/68e3ba35-74e7-4437-94b6-d17430d4059c-audit-log\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.148747 master-0 kubenswrapper[13984]: I0312 12:26:27.148682 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-secret-metrics-client-certs\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.148851 master-0 kubenswrapper[13984]: I0312 12:26:27.148820 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/68e3ba35-74e7-4437-94b6-d17430d4059c-metrics-server-audit-profiles\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.148904 master-0 kubenswrapper[13984]: I0312 12:26:27.148872 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx6qz\" (UniqueName: \"kubernetes.io/projected/68e3ba35-74e7-4437-94b6-d17430d4059c-kube-api-access-fx6qz\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.148987 master-0 kubenswrapper[13984]: I0312 12:26:27.148961 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-secret-metrics-server-tls\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.149035 master-0 kubenswrapper[13984]: I0312 12:26:27.149008 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e3ba35-74e7-4437-94b6-d17430d4059c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.250865 master-0 kubenswrapper[13984]: I0312 12:26:27.250799 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/68e3ba35-74e7-4437-94b6-d17430d4059c-audit-log\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.251085 master-0 kubenswrapper[13984]: I0312 12:26:27.250886 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: 
\"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-secret-metrics-client-certs\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.251085 master-0 kubenswrapper[13984]: I0312 12:26:27.250959 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/68e3ba35-74e7-4437-94b6-d17430d4059c-metrics-server-audit-profiles\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.251085 master-0 kubenswrapper[13984]: I0312 12:26:27.250994 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx6qz\" (UniqueName: \"kubernetes.io/projected/68e3ba35-74e7-4437-94b6-d17430d4059c-kube-api-access-fx6qz\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.251085 master-0 kubenswrapper[13984]: I0312 12:26:27.251018 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-secret-metrics-server-tls\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.251085 master-0 kubenswrapper[13984]: I0312 12:26:27.251047 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e3ba35-74e7-4437-94b6-d17430d4059c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " 
pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.251421 master-0 kubenswrapper[13984]: I0312 12:26:27.251090 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-client-ca-bundle\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.251421 master-0 kubenswrapper[13984]: I0312 12:26:27.251253 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/68e3ba35-74e7-4437-94b6-d17430d4059c-audit-log\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.253511 master-0 kubenswrapper[13984]: I0312 12:26:27.253337 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/68e3ba35-74e7-4437-94b6-d17430d4059c-metrics-server-audit-profiles\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.254024 master-0 kubenswrapper[13984]: I0312 12:26:27.253947 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/68e3ba35-74e7-4437-94b6-d17430d4059c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.255880 master-0 kubenswrapper[13984]: I0312 12:26:27.255852 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-client-ca-bundle\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.259073 master-0 kubenswrapper[13984]: I0312 12:26:27.259028 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-secret-metrics-server-tls\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.259188 master-0 kubenswrapper[13984]: I0312 12:26:27.259132 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/68e3ba35-74e7-4437-94b6-d17430d4059c-secret-metrics-client-certs\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.270139 master-0 kubenswrapper[13984]: I0312 12:26:27.270114 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx6qz\" (UniqueName: \"kubernetes.io/projected/68e3ba35-74e7-4437-94b6-d17430d4059c-kube-api-access-fx6qz\") pod \"metrics-server-55dcbc49f6-v92w7\" (UID: \"68e3ba35-74e7-4437-94b6-d17430d4059c\") " pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.388411 master-0 kubenswrapper[13984]: I0312 12:26:27.388272 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-srg57" event={"ID":"9e18a6c1-574b-4191-8672-ca05718474d6","Type":"ContainerStarted","Data":"bbc898107b5b2283897e30068a1a7a28c2a43789bfd10cf3b625ec21507db2cd"} Mar 12 12:26:27.388411 master-0 kubenswrapper[13984]: I0312 12:26:27.388323 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-srg57" event={"ID":"9e18a6c1-574b-4191-8672-ca05718474d6","Type":"ContainerStarted","Data":"39c3657c8cf3419636640380fc4b0edc43b224bf764a5db00a45fc16464d24d0"} Mar 12 12:26:27.390494 master-0 kubenswrapper[13984]: I0312 12:26:27.390449 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:27.413536 master-0 kubenswrapper[13984]: I0312 12:26:27.413412 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-srg57" podStartSLOduration=3.689256323 podStartE2EDuration="6.413386855s" podCreationTimestamp="2026-03-12 12:26:21 +0000 UTC" firstStartedPulling="2026-03-12 12:26:22.713739615 +0000 UTC m=+114.911755107" lastFinishedPulling="2026-03-12 12:26:25.437870147 +0000 UTC m=+117.635885639" observedRunningTime="2026-03-12 12:26:27.409715725 +0000 UTC m=+119.607731277" watchObservedRunningTime="2026-03-12 12:26:27.413386855 +0000 UTC m=+119.611402387" Mar 12 12:26:27.549886 master-0 kubenswrapper[13984]: I0312 12:26:27.549848 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"] Mar 12 12:26:27.550835 master-0 kubenswrapper[13984]: I0312 12:26:27.550819 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"
Mar 12 12:26:27.553705 master-0 kubenswrapper[13984]: I0312 12:26:27.553689 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 12 12:26:27.558257 master-0 kubenswrapper[13984]: I0312 12:26:27.558238 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-lqvhq"
Mar 12 12:26:27.563353 master-0 kubenswrapper[13984]: I0312 12:26:27.563330 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"]
Mar 12 12:26:27.658752 master-0 kubenswrapper[13984]: I0312 12:26:27.658555 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fdea4a82-0757-499a-9a74-df5f79373cbe-monitoring-plugin-cert\") pod \"monitoring-plugin-76d66fb5d5-l4mgd\" (UID: \"fdea4a82-0757-499a-9a74-df5f79373cbe\") " pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"
Mar 12 12:26:27.759743 master-0 kubenswrapper[13984]: I0312 12:26:27.759531 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fdea4a82-0757-499a-9a74-df5f79373cbe-monitoring-plugin-cert\") pod \"monitoring-plugin-76d66fb5d5-l4mgd\" (UID: \"fdea4a82-0757-499a-9a74-df5f79373cbe\") " pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"
Mar 12 12:26:27.763931 master-0 kubenswrapper[13984]: I0312 12:26:27.763911 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/fdea4a82-0757-499a-9a74-df5f79373cbe-monitoring-plugin-cert\") pod \"monitoring-plugin-76d66fb5d5-l4mgd\" (UID: \"fdea4a82-0757-499a-9a74-df5f79373cbe\") " pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"
Mar 12 12:26:27.873785 master-0 kubenswrapper[13984]: I0312 12:26:27.873728 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"
Mar 12 12:26:28.001890 master-0 kubenswrapper[13984]: I0312 12:26:28.001244 13984 scope.go:117] "RemoveContainer" containerID="823aef6fae9a1e0ac9ce3e87b09c4b094495f36691d261ce34a1a0d40c54755e"
Mar 12 12:26:28.295794 master-0 kubenswrapper[13984]: I0312 12:26:28.295745 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 12:26:28.306709 master-0 kubenswrapper[13984]: I0312 12:26:28.306622 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.317710 master-0 kubenswrapper[13984]: I0312 12:26:28.317645 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 12 12:26:28.318035 master-0 kubenswrapper[13984]: I0312 12:26:28.318007 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-7vmf8"
Mar 12 12:26:28.318333 master-0 kubenswrapper[13984]: I0312 12:26:28.318310 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 12 12:26:28.318689 master-0 kubenswrapper[13984]: I0312 12:26:28.318637 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 12 12:26:28.318801 master-0 kubenswrapper[13984]: I0312 12:26:28.318775 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 12 12:26:28.320658 master-0 kubenswrapper[13984]: I0312 12:26:28.318900 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 12 12:26:28.320658 master-0 kubenswrapper[13984]: I0312 12:26:28.319006 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 12 12:26:28.320658 master-0 kubenswrapper[13984]: I0312 12:26:28.319169 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 12 12:26:28.320658 master-0 kubenswrapper[13984]: I0312 12:26:28.319314 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 12 12:26:28.320658 master-0 kubenswrapper[13984]: I0312 12:26:28.319397 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 12 12:26:28.320658 master-0 kubenswrapper[13984]: I0312 12:26:28.320349 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 12 12:26:28.323904 master-0 kubenswrapper[13984]: I0312 12:26:28.320779 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 12:26:28.323904 master-0 kubenswrapper[13984]: I0312 12:26:28.321701 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-29i444egqnkse"
Mar 12 12:26:28.326936 master-0 kubenswrapper[13984]: I0312 12:26:28.326907 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 12 12:26:28.381171 master-0 kubenswrapper[13984]: I0312 12:26:28.380468 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"]
Mar 12 12:26:28.407166 master-0 kubenswrapper[13984]: I0312 12:26:28.407123 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd" event={"ID":"fdea4a82-0757-499a-9a74-df5f79373cbe","Type":"ContainerStarted","Data":"28216cc6467108bdc27003384ebf0b7c71a0090711702d086312c254bca54c39"}
Mar 12 12:26:28.410667 master-0 kubenswrapper[13984]: I0312 12:26:28.410622 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" event={"ID":"015b85c8-1d75-4de2-94b7-1b43b43219a0","Type":"ContainerStarted","Data":"71dc9df8eaac7dbff29ca7145c1d51e6d084f66ad79ece207de888d0f21b7bdd"}
Mar 12 12:26:28.410667 master-0 kubenswrapper[13984]: I0312 12:26:28.410658 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" event={"ID":"015b85c8-1d75-4de2-94b7-1b43b43219a0","Type":"ContainerStarted","Data":"0f323958d130c3a3bdbf74a176085b31fc09f9687778b435091953de29d006ca"}
Mar 12 12:26:28.480004 master-0 kubenswrapper[13984]: I0312 12:26:28.479956 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55dcbc49f6-v92w7"]
Mar 12 12:26:28.483241 master-0 kubenswrapper[13984]: I0312 12:26:28.483198 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483299 master-0 kubenswrapper[13984]: I0312 12:26:28.483250 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483299 master-0 kubenswrapper[13984]: I0312 12:26:28.483269 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95hq7\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-kube-api-access-95hq7\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483383 master-0 kubenswrapper[13984]: I0312 12:26:28.483299 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483383 master-0 kubenswrapper[13984]: I0312 12:26:28.483347 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483456 master-0 kubenswrapper[13984]: I0312 12:26:28.483403 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483456 master-0 kubenswrapper[13984]: I0312 12:26:28.483430 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483456 master-0 kubenswrapper[13984]: I0312 12:26:28.483447 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-web-config\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483607 master-0 kubenswrapper[13984]: I0312 12:26:28.483467 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483607 master-0 kubenswrapper[13984]: I0312 12:26:28.483594 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483676 master-0 kubenswrapper[13984]: I0312 12:26:28.483621 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483676 master-0 kubenswrapper[13984]: I0312 12:26:28.483641 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-config-out\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483676 master-0 kubenswrapper[13984]: I0312 12:26:28.483662 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-config\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483795 master-0 kubenswrapper[13984]: I0312 12:26:28.483743 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483795 master-0 kubenswrapper[13984]: I0312 12:26:28.483767 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483795 master-0 kubenswrapper[13984]: I0312 12:26:28.483783 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483906 master-0 kubenswrapper[13984]: I0312 12:26:28.483818 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.483977 master-0 kubenswrapper[13984]: I0312 12:26:28.483938 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.585350 master-0 kubenswrapper[13984]: I0312 12:26:28.585301 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.585611 master-0 kubenswrapper[13984]: I0312 12:26:28.585594 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-config-out\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.585942 master-0 kubenswrapper[13984]: I0312 12:26:28.585881 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-config\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.585992 master-0 kubenswrapper[13984]: I0312 12:26:28.585970 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.586035 master-0 kubenswrapper[13984]: I0312 12:26:28.586002 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.586035 master-0 kubenswrapper[13984]: I0312 12:26:28.586020 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.586845 master-0 kubenswrapper[13984]: I0312 12:26:28.586797 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.586922 master-0 kubenswrapper[13984]: I0312 12:26:28.586855 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.586970 master-0 kubenswrapper[13984]: I0312 12:26:28.586925 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.586970 master-0 kubenswrapper[13984]: I0312 12:26:28.586965 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587126 master-0 kubenswrapper[13984]: I0312 12:26:28.587081 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95hq7\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-kube-api-access-95hq7\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587203 master-0 kubenswrapper[13984]: I0312 12:26:28.587178 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587263 master-0 kubenswrapper[13984]: I0312 12:26:28.587242 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587263 master-0 kubenswrapper[13984]: I0312 12:26:28.587248 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587654 master-0 kubenswrapper[13984]: I0312 12:26:28.587625 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587725 master-0 kubenswrapper[13984]: I0312 12:26:28.587664 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587725 master-0 kubenswrapper[13984]: I0312 12:26:28.587683 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-web-config\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587725 master-0 kubenswrapper[13984]: I0312 12:26:28.587711 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587851 master-0 kubenswrapper[13984]: I0312 12:26:28.587747 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.587895 master-0 kubenswrapper[13984]: E0312 12:26:28.587868 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:29.087857302 +0000 UTC m=+121.285872794 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt
Mar 12 12:26:28.588570 master-0 kubenswrapper[13984]: I0312 12:26:28.586859 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.588834 master-0 kubenswrapper[13984]: I0312 12:26:28.588791 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.588889 master-0 kubenswrapper[13984]: I0312 12:26:28.588838 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.589578 master-0 kubenswrapper[13984]: I0312 12:26:28.589554 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.590221 master-0 kubenswrapper[13984]: I0312 12:26:28.590189 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.591863 master-0 kubenswrapper[13984]: I0312 12:26:28.591807 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-config\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.593331 master-0 kubenswrapper[13984]: I0312 12:26:28.592953 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.593331 master-0 kubenswrapper[13984]: I0312 12:26:28.593228 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-config-out\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.595456 master-0 kubenswrapper[13984]: I0312 12:26:28.595406 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.596360 master-0 kubenswrapper[13984]: I0312 12:26:28.595903 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.596360 master-0 kubenswrapper[13984]: I0312 12:26:28.596145 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.596891 master-0 kubenswrapper[13984]: I0312 12:26:28.596807 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-web-config\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.597743 master-0 kubenswrapper[13984]: I0312 12:26:28.597700 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.598133 master-0 kubenswrapper[13984]: I0312 12:26:28.598081 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.603231 master-0 kubenswrapper[13984]: I0312 12:26:28.603185 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:28.610950 master-0 kubenswrapper[13984]: I0312 12:26:28.610898 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95hq7\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-kube-api-access-95hq7\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:29.097091 master-0 kubenswrapper[13984]: I0312 12:26:29.096975 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:29.097468 master-0 kubenswrapper[13984]: E0312 12:26:29.097283 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:30.097246453 +0000 UTC m=+122.295261945 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt
Mar 12 12:26:29.420813 master-0 kubenswrapper[13984]: I0312 12:26:29.420699 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" event={"ID":"015b85c8-1d75-4de2-94b7-1b43b43219a0","Type":"ContainerStarted","Data":"4aa66c38b954b984daf02eeeae49b65c9cef140d1dd08f32f07cc4f42406ef4c"}
Mar 12 12:26:29.422526 master-0 kubenswrapper[13984]: I0312 12:26:29.422464 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" event={"ID":"68e3ba35-74e7-4437-94b6-d17430d4059c","Type":"ContainerStarted","Data":"3c13f48478d1486c45378190d746616c0388aee61a4ccb3321c8893cccd1f85d"}
Mar 12 12:26:30.115409 master-0 kubenswrapper[13984]: I0312 12:26:30.115342 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:30.115672 master-0 kubenswrapper[13984]: E0312 12:26:30.115450 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:32.115433778 +0000 UTC m=+124.313449270 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt
Mar 12 12:26:30.432835 master-0 kubenswrapper[13984]: I0312 12:26:30.432724 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" event={"ID":"015b85c8-1d75-4de2-94b7-1b43b43219a0","Type":"ContainerStarted","Data":"4d4400e0dbded3cd0038df0f8311a73aaf1f455c92a3fcb68e62f8a7cff8d942"}
Mar 12 12:26:30.728029 master-0 kubenswrapper[13984]: I0312 12:26:30.727980 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 12 12:26:30.728311 master-0 kubenswrapper[13984]: E0312 12:26:30.728282 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:38.728246946 +0000 UTC m=+130.926262478 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt
Mar 12 12:26:31.440693 master-0 kubenswrapper[13984]: I0312 12:26:31.440616 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" event={"ID":"68e3ba35-74e7-4437-94b6-d17430d4059c","Type":"ContainerStarted","Data":"b717274c00315125a4b54c6d66c902c672f5b6205c14c0823edaa84c9e138ed4"}
Mar 12 12:26:31.442692 master-0 kubenswrapper[13984]: I0312 12:26:31.442656 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd" event={"ID":"fdea4a82-0757-499a-9a74-df5f79373cbe","Type":"ContainerStarted","Data":"185b3eb892563b665b507be1823551f61aa37384b8349045afb3bebc2f7dca77"}
Mar 12 12:26:31.443396 master-0 kubenswrapper[13984]: I0312 12:26:31.443240 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"
Mar 12 12:26:31.447841 master-0 kubenswrapper[13984]: I0312 12:26:31.447773 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" event={"ID":"015b85c8-1d75-4de2-94b7-1b43b43219a0","Type":"ContainerStarted","Data":"02802201498ccd39c321d688170af37c8ec81f96b3eef65cefc9dabee8f12f85"}
Mar 12 12:26:31.447841 master-0 kubenswrapper[13984]: I0312 12:26:31.447818 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" event={"ID":"015b85c8-1d75-4de2-94b7-1b43b43219a0","Type":"ContainerStarted","Data":"69ff1b399c624130dbe31ba1b37d843e99ee86e8ed89e6733e9a0f473f487f3f"}
Mar 12 12:26:31.448114 master-0 kubenswrapper[13984]: I0312 12:26:31.448009 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl"
Mar 12 12:26:31.449201 master-0 kubenswrapper[13984]: I0312 12:26:31.449155 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd"
Mar 12 12:26:31.466410 master-0 kubenswrapper[13984]: I0312 12:26:31.466318 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" podStartSLOduration=2.442958039 podStartE2EDuration="4.466294375s" podCreationTimestamp="2026-03-12 12:26:27 +0000 UTC" firstStartedPulling="2026-03-12 12:26:28.486689365 +0000 UTC m=+120.684704857" lastFinishedPulling="2026-03-12 12:26:30.510025701 +0000 UTC m=+122.708041193" observedRunningTime="2026-03-12 12:26:31.465101433 +0000 UTC m=+123.663116935" watchObservedRunningTime="2026-03-12 12:26:31.466294375 +0000 UTC m=+123.664309907"
Mar 12 12:26:31.499655 master-0 kubenswrapper[13984]: I0312 12:26:31.499550 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-76d66fb5d5-l4mgd" podStartSLOduration=2.3975652849999998 podStartE2EDuration="4.499522846s" podCreationTimestamp="2026-03-12 12:26:27 +0000 UTC" firstStartedPulling="2026-03-12 12:26:28.402822738 +0000 UTC m=+120.600838230" lastFinishedPulling="2026-03-12 12:26:30.504780279 +0000 UTC m=+122.702795791" observedRunningTime="2026-03-12 12:26:31.49155085 +0000 UTC m=+123.689566402" watchObservedRunningTime="2026-03-12 12:26:31.499522846 +0000 UTC m=+123.697538388"
Mar 12 12:26:31.526132 master-0 kubenswrapper[13984]: I0312 12:26:31.525996 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl" podStartSLOduration=4.061924971 podStartE2EDuration="8.525962694s" podCreationTimestamp="2026-03-12 12:26:23 +0000 UTC" firstStartedPulling="2026-03-12 12:26:25.103896309 +0000 UTC m=+117.301911811" lastFinishedPulling="2026-03-12 12:26:29.567934042 +0000 UTC m=+121.765949534" observedRunningTime="2026-03-12 12:26:31.524591517 +0000 UTC m=+123.722607009" watchObservedRunningTime="2026-03-12 12:26:31.525962694 +0000 UTC m=+123.723978226"
Mar 12 12:26:32.148784 master-0 kubenswrapper[13984]: I0312 12:26:32.148670 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:32.149303 master-0 kubenswrapper[13984]: E0312 12:26:32.148912 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:36.148875567 +0000 UTC m=+128.346891099 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt
Mar 12 12:26:34.297509 master-0 kubenswrapper[13984]: I0312 12:26:34.297429 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-c5678cd69-z9dpl"
Mar 12 12:26:36.286244 master-0 kubenswrapper[13984]: I0312 12:26:36.285130 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:26:36.287530 master-0 kubenswrapper[13984]: E0312 12:26:36.287411 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:44.287391081 +0000 UTC m=+136.485406573 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:38.826711 master-0 kubenswrapper[13984]: I0312 12:26:38.826626 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:38.827257 master-0 kubenswrapper[13984]: E0312 12:26:38.826853 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:26:54.826824328 +0000 UTC m=+147.024839870 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:44.335190 master-0 kubenswrapper[13984]: I0312 12:26:44.335109 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:26:44.335977 master-0 kubenswrapper[13984]: E0312 12:26:44.335294 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:27:00.335278678 +0000 UTC m=+152.533294170 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:45.148796 master-0 kubenswrapper[13984]: I0312 12:26:45.148368 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:26:45.148796 master-0 kubenswrapper[13984]: I0312 12:26:45.148453 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:26:45.149145 master-0 kubenswrapper[13984]: E0312 12:26:45.148808 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:26:45.149145 master-0 kubenswrapper[13984]: E0312 12:26:45.148851 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:26:45.149145 master-0 kubenswrapper[13984]: E0312 12:26:45.148932 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. 
No retries permitted until 2026-03-12 12:28:47.148908628 +0000 UTC m=+259.346924130 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:26:45.149145 master-0 kubenswrapper[13984]: E0312 12:26:45.149016 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:26:45.149145 master-0 kubenswrapper[13984]: E0312 12:26:45.149076 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:26:45.149604 master-0 kubenswrapper[13984]: E0312 12:26:45.149196 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:28:47.149157735 +0000 UTC m=+259.347173267 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:26:47.391227 master-0 kubenswrapper[13984]: I0312 12:26:47.391178 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:47.391827 master-0 kubenswrapper[13984]: I0312 12:26:47.391252 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:26:54.894317 master-0 kubenswrapper[13984]: I0312 12:26:54.894234 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:26:54.895285 master-0 kubenswrapper[13984]: E0312 12:26:54.894419 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:27:26.894391842 +0000 UTC m=+179.092407324 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:26:57.365675 master-0 kubenswrapper[13984]: I0312 12:26:57.365566 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 12 12:26:57.367162 master-0 kubenswrapper[13984]: I0312 12:26:57.367119 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.371166 master-0 kubenswrapper[13984]: I0312 12:26:57.371098 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 12 12:26:57.371702 master-0 kubenswrapper[13984]: I0312 12:26:57.371620 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-kqf6h" Mar 12 12:26:57.385526 master-0 kubenswrapper[13984]: I0312 12:26:57.385394 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 12 12:26:57.433781 master-0 kubenswrapper[13984]: I0312 12:26:57.433713 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ef1ec5-3e17-485a-9797-566ef207fa0a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.434021 master-0 kubenswrapper[13984]: I0312 12:26:57.433794 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-kubelet-dir\") pod \"installer-2-master-0\" (UID: 
\"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.434021 master-0 kubenswrapper[13984]: I0312 12:26:57.433899 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-var-lock\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.535032 master-0 kubenswrapper[13984]: I0312 12:26:57.534970 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-var-lock\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.535273 master-0 kubenswrapper[13984]: I0312 12:26:57.535148 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-var-lock\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.535273 master-0 kubenswrapper[13984]: I0312 12:26:57.535266 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ef1ec5-3e17-485a-9797-566ef207fa0a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.535433 master-0 kubenswrapper[13984]: I0312 12:26:57.535407 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " 
pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.535689 master-0 kubenswrapper[13984]: I0312 12:26:57.535660 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.571106 master-0 kubenswrapper[13984]: I0312 12:26:57.571021 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ef1ec5-3e17-485a-9797-566ef207fa0a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:57.699412 master-0 kubenswrapper[13984]: I0312 12:26:57.699333 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 12 12:26:58.319082 master-0 kubenswrapper[13984]: I0312 12:26:58.319027 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 12 12:26:58.321842 master-0 kubenswrapper[13984]: W0312 12:26:58.321741 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod51ef1ec5_3e17_485a_9797_566ef207fa0a.slice/crio-67784e43e2565e8ee0d24dbff6968398aa1240116dad0f3a51f0d9297f5983cf WatchSource:0}: Error finding container 67784e43e2565e8ee0d24dbff6968398aa1240116dad0f3a51f0d9297f5983cf: Status 404 returned error can't find the container with id 67784e43e2565e8ee0d24dbff6968398aa1240116dad0f3a51f0d9297f5983cf Mar 12 12:26:58.622455 master-0 kubenswrapper[13984]: I0312 12:26:58.622368 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" 
event={"ID":"51ef1ec5-3e17-485a-9797-566ef207fa0a","Type":"ContainerStarted","Data":"67784e43e2565e8ee0d24dbff6968398aa1240116dad0f3a51f0d9297f5983cf"} Mar 12 12:26:59.251872 master-0 kubenswrapper[13984]: E0312 12:26:59.251811 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" podUID="c424f946-e2fe-4450-816b-b79640269ff5" Mar 12 12:26:59.630082 master-0 kubenswrapper[13984]: I0312 12:26:59.629970 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:26:59.630082 master-0 kubenswrapper[13984]: I0312 12:26:59.630020 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"51ef1ec5-3e17-485a-9797-566ef207fa0a","Type":"ContainerStarted","Data":"0fbabe3073d95ecbb958b15aeb524e556fc382f93d8cb6812d8af4c170725975"} Mar 12 12:26:59.652889 master-0 kubenswrapper[13984]: I0312 12:26:59.652798 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.6527779540000003 podStartE2EDuration="2.652777954s" podCreationTimestamp="2026-03-12 12:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:26:59.646019174 +0000 UTC m=+151.844034706" watchObservedRunningTime="2026-03-12 12:26:59.652777954 +0000 UTC m=+151.850793466" Mar 12 12:27:00.431166 master-0 kubenswrapper[13984]: I0312 12:27:00.429795 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:27:00.431166 master-0 kubenswrapper[13984]: E0312 12:27:00.430035 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:27:32.430011241 +0000 UTC m=+184.628026743 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:27:04.187629 master-0 kubenswrapper[13984]: I0312 12:27:04.187548 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:27:04.188181 master-0 kubenswrapper[13984]: E0312 12:27:04.187795 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:29:06.18776225 +0000 UTC m=+278.385777782 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:27:07.397500 master-0 kubenswrapper[13984]: I0312 12:27:07.397384 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:27:07.401616 master-0 kubenswrapper[13984]: I0312 12:27:07.401546 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55dcbc49f6-v92w7" Mar 12 12:27:26.977875 master-0 kubenswrapper[13984]: I0312 12:27:26.977734 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:27:26.979170 master-0 kubenswrapper[13984]: E0312 12:27:26.978075 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:28:30.978041528 +0000 UTC m=+243.176057030 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:27:29.775730 master-0 kubenswrapper[13984]: I0312 12:27:29.775665 13984 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 12 12:27:29.776293 master-0 kubenswrapper[13984]: I0312 12:27:29.776002 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" containerID="cri-o://b0a55edaf166c34b98b5dcf71563b10a7151eb2e9e60290d0d641d4546feecf7" gracePeriod=30 Mar 12 12:27:29.776293 master-0 kubenswrapper[13984]: I0312 12:27:29.776059 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" containerID="cri-o://75e3bbf6064ff7f35e82389d300ce882963ffc8541436a1a80d239d3b971f5e4" gracePeriod=30 Mar 12 12:27:29.776293 master-0 kubenswrapper[13984]: I0312 12:27:29.776118 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" containerID="cri-o://49b7eed659093dd9559e33b3a0f638adb7dbba47c5e4ef561aa467ffc52b0eec" gracePeriod=30 Mar 12 12:27:29.776293 master-0 kubenswrapper[13984]: I0312 12:27:29.776170 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" containerID="cri-o://f7059097bb2ddbff3e211fcd38dee654f074ce10a4773944219fb7905e3d5723" gracePeriod=30 Mar 12 12:27:29.776452 master-0 kubenswrapper[13984]: I0312 12:27:29.776256 13984 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" containerID="cri-o://b2f5073adcb260f7bafd0dbb4eae76a0d78ce200196488dbbf37a087b47f06a5" gracePeriod=30 Mar 12 12:27:29.782888 master-0 kubenswrapper[13984]: I0312 12:27:29.782825 13984 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 12 12:27:29.783334 master-0 kubenswrapper[13984]: E0312 12:27:29.783295 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 12 12:27:29.783334 master-0 kubenswrapper[13984]: I0312 12:27:29.783329 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 12 12:27:29.783428 master-0 kubenswrapper[13984]: E0312 12:27:29.783363 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 12 12:27:29.783428 master-0 kubenswrapper[13984]: I0312 12:27:29.783380 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 12 12:27:29.783428 master-0 kubenswrapper[13984]: E0312 12:27:29.783411 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 12 12:27:29.783567 master-0 kubenswrapper[13984]: I0312 12:27:29.783428 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 12 12:27:29.783567 master-0 kubenswrapper[13984]: E0312 12:27:29.783456 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 12 12:27:29.783567 master-0 kubenswrapper[13984]: I0312 12:27:29.783473 13984 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 12 12:27:29.783567 master-0 kubenswrapper[13984]: E0312 12:27:29.783525 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 12 12:27:29.783567 master-0 kubenswrapper[13984]: I0312 12:27:29.783542 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 12 12:27:29.783764 master-0 kubenswrapper[13984]: E0312 12:27:29.783572 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 12 12:27:29.783764 master-0 kubenswrapper[13984]: I0312 12:27:29.783590 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 12 12:27:29.783764 master-0 kubenswrapper[13984]: E0312 12:27:29.783615 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 12 12:27:29.783764 master-0 kubenswrapper[13984]: I0312 12:27:29.783631 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 12 12:27:29.783764 master-0 kubenswrapper[13984]: E0312 12:27:29.783671 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 12 12:27:29.783764 master-0 kubenswrapper[13984]: I0312 12:27:29.783689 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 12 12:27:29.784064 master-0 kubenswrapper[13984]: I0312 12:27:29.783973 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 12 12:27:29.784064 master-0 kubenswrapper[13984]: I0312 12:27:29.784030 13984 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 12 12:27:29.784064 master-0 kubenswrapper[13984]: I0312 12:27:29.784059 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 12 12:27:29.784207 master-0 kubenswrapper[13984]: I0312 12:27:29.784080 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 12 12:27:29.784207 master-0 kubenswrapper[13984]: I0312 12:27:29.784104 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 12 12:27:29.784207 master-0 kubenswrapper[13984]: I0312 12:27:29.784127 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 12 12:27:29.784207 master-0 kubenswrapper[13984]: I0312 12:27:29.784178 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 12 12:27:29.784403 master-0 kubenswrapper[13984]: I0312 12:27:29.784212 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 12 12:27:29.933198 master-0 kubenswrapper[13984]: I0312 12:27:29.933116 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:29.933334 master-0 kubenswrapper[13984]: I0312 12:27:29.933263 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:29.933423 master-0 kubenswrapper[13984]: I0312 12:27:29.933376 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:29.933538 master-0 kubenswrapper[13984]: I0312 12:27:29.933461 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:29.933691 master-0 kubenswrapper[13984]: I0312 12:27:29.933647 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:29.933748 master-0 kubenswrapper[13984]: I0312 12:27:29.933719 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036132 master-0 kubenswrapper[13984]: I0312 12:27:30.035923 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" 
(UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036132 master-0 kubenswrapper[13984]: I0312 12:27:30.036019 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036132 master-0 kubenswrapper[13984]: I0312 12:27:30.036110 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036647 master-0 kubenswrapper[13984]: I0312 12:27:30.036137 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036647 master-0 kubenswrapper[13984]: I0312 12:27:30.036196 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036647 master-0 kubenswrapper[13984]: I0312 12:27:30.036217 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036647 master-0 kubenswrapper[13984]: I0312 12:27:30.036241 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036647 master-0 kubenswrapper[13984]: I0312 12:27:30.036243 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036647 master-0 kubenswrapper[13984]: I0312 12:27:30.036509 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.036647 master-0 kubenswrapper[13984]: I0312 12:27:30.036583 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.037164 master-0 kubenswrapper[13984]: I0312 12:27:30.036692 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.037164 master-0 kubenswrapper[13984]: I0312 12:27:30.036724 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " 
pod="openshift-etcd/etcd-master-0" Mar 12 12:27:30.889998 master-0 kubenswrapper[13984]: I0312 12:27:30.889909 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 12 12:27:30.892716 master-0 kubenswrapper[13984]: I0312 12:27:30.892637 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 12 12:27:30.894718 master-0 kubenswrapper[13984]: I0312 12:27:30.894638 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="75e3bbf6064ff7f35e82389d300ce882963ffc8541436a1a80d239d3b971f5e4" exitCode=2 Mar 12 12:27:30.894718 master-0 kubenswrapper[13984]: I0312 12:27:30.894695 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="49b7eed659093dd9559e33b3a0f638adb7dbba47c5e4ef561aa467ffc52b0eec" exitCode=0 Mar 12 12:27:30.894718 master-0 kubenswrapper[13984]: I0312 12:27:30.894716 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="f7059097bb2ddbff3e211fcd38dee654f074ce10a4773944219fb7905e3d5723" exitCode=2 Mar 12 12:27:32.478244 master-0 kubenswrapper[13984]: I0312 12:27:32.478124 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:27:32.479383 master-0 kubenswrapper[13984]: E0312 12:27:32.478324 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:28:36.478306738 +0000 UTC m=+248.676322230 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:27:42.612932 master-0 kubenswrapper[13984]: I0312 12:27:42.612860 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 12 12:27:42.613545 master-0 kubenswrapper[13984]: I0312 12:27:42.612939 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 12 12:27:42.981926 master-0 kubenswrapper[13984]: I0312 12:27:42.981881 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:27:42.982129 master-0 kubenswrapper[13984]: I0312 12:27:42.981930 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="cfafaefe1186e2aaf0be8643dd0cbdc3cd2e42fe49f94ac73da95ad15ec95688" exitCode=1 Mar 12 12:27:42.982129 master-0 kubenswrapper[13984]: I0312 12:27:42.981976 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerDied","Data":"cfafaefe1186e2aaf0be8643dd0cbdc3cd2e42fe49f94ac73da95ad15ec95688"} Mar 12 12:27:42.982441 master-0 kubenswrapper[13984]: I0312 12:27:42.982410 13984 scope.go:117] "RemoveContainer" containerID="cfafaefe1186e2aaf0be8643dd0cbdc3cd2e42fe49f94ac73da95ad15ec95688" Mar 12 12:27:43.989462 master-0 kubenswrapper[13984]: I0312 12:27:43.989416 13984 generic.go:334] "Generic (PLEG): container finished" podID="51ef1ec5-3e17-485a-9797-566ef207fa0a" containerID="0fbabe3073d95ecbb958b15aeb524e556fc382f93d8cb6812d8af4c170725975" exitCode=0 Mar 12 12:27:43.990283 master-0 kubenswrapper[13984]: I0312 12:27:43.990237 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"51ef1ec5-3e17-485a-9797-566ef207fa0a","Type":"ContainerDied","Data":"0fbabe3073d95ecbb958b15aeb524e556fc382f93d8cb6812d8af4c170725975"} Mar 12 12:27:43.992969 master-0 kubenswrapper[13984]: I0312 12:27:43.992916 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:27:43.993037 master-0 kubenswrapper[13984]: I0312 12:27:43.993010 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"765bad2b5a42ad204d381d1e7b6c234c53c95f09dcfc3c8f5a96103e042780ef"} Mar 12 12:27:45.337287 master-0 kubenswrapper[13984]: I0312 12:27:45.337223 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 12 12:27:45.490575 master-0 kubenswrapper[13984]: I0312 12:27:45.490455 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ef1ec5-3e17-485a-9797-566ef207fa0a-kube-api-access\") pod \"51ef1ec5-3e17-485a-9797-566ef207fa0a\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " Mar 12 12:27:45.490815 master-0 kubenswrapper[13984]: I0312 12:27:45.490623 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-kubelet-dir\") pod \"51ef1ec5-3e17-485a-9797-566ef207fa0a\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " Mar 12 12:27:45.490815 master-0 kubenswrapper[13984]: I0312 12:27:45.490729 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "51ef1ec5-3e17-485a-9797-566ef207fa0a" (UID: "51ef1ec5-3e17-485a-9797-566ef207fa0a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:27:45.490815 master-0 kubenswrapper[13984]: I0312 12:27:45.490792 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-var-lock\") pod \"51ef1ec5-3e17-485a-9797-566ef207fa0a\" (UID: \"51ef1ec5-3e17-485a-9797-566ef207fa0a\") " Mar 12 12:27:45.490939 master-0 kubenswrapper[13984]: I0312 12:27:45.490857 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-var-lock" (OuterVolumeSpecName: "var-lock") pod "51ef1ec5-3e17-485a-9797-566ef207fa0a" (UID: "51ef1ec5-3e17-485a-9797-566ef207fa0a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:27:45.491386 master-0 kubenswrapper[13984]: I0312 12:27:45.491341 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:27:45.491444 master-0 kubenswrapper[13984]: I0312 12:27:45.491386 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/51ef1ec5-3e17-485a-9797-566ef207fa0a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:27:45.494309 master-0 kubenswrapper[13984]: I0312 12:27:45.494224 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ef1ec5-3e17-485a-9797-566ef207fa0a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "51ef1ec5-3e17-485a-9797-566ef207fa0a" (UID: "51ef1ec5-3e17-485a-9797-566ef207fa0a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:27:45.592620 master-0 kubenswrapper[13984]: I0312 12:27:45.592443 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/51ef1ec5-3e17-485a-9797-566ef207fa0a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:27:46.012135 master-0 kubenswrapper[13984]: I0312 12:27:46.012066 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"51ef1ec5-3e17-485a-9797-566ef207fa0a","Type":"ContainerDied","Data":"67784e43e2565e8ee0d24dbff6968398aa1240116dad0f3a51f0d9297f5983cf"} Mar 12 12:27:46.012135 master-0 kubenswrapper[13984]: I0312 12:27:46.012114 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67784e43e2565e8ee0d24dbff6968398aa1240116dad0f3a51f0d9297f5983cf" Mar 12 12:27:46.012562 master-0 kubenswrapper[13984]: I0312 12:27:46.012150 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 12 12:27:46.528115 master-0 kubenswrapper[13984]: E0312 12:27:46.528008 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:27:46.579647 master-0 kubenswrapper[13984]: I0312 12:27:46.579547 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:27:46.585436 master-0 kubenswrapper[13984]: I0312 12:27:46.585392 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:27:47.025208 master-0 kubenswrapper[13984]: I0312 12:27:47.025142 13984 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="a9d9b5f96bde28a030172fa8b8562f0ad2738118cc137cd6bc087cf4fbc7972f" exitCode=1 Mar 12 12:27:47.025447 master-0 kubenswrapper[13984]: I0312 12:27:47.025269 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"a9d9b5f96bde28a030172fa8b8562f0ad2738118cc137cd6bc087cf4fbc7972f"} Mar 12 12:27:47.025447 master-0 kubenswrapper[13984]: I0312 12:27:47.025366 13984 scope.go:117] "RemoveContainer" containerID="727b627c806117dd5f2141d70aae9b4f04fa57747cadae9611a9e80d6ca1b04b" Mar 12 12:27:47.026159 master-0 kubenswrapper[13984]: I0312 12:27:47.026108 13984 scope.go:117] "RemoveContainer" containerID="a9d9b5f96bde28a030172fa8b8562f0ad2738118cc137cd6bc087cf4fbc7972f" Mar 12 12:27:47.026666 master-0 kubenswrapper[13984]: E0312 12:27:47.026612 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-scheduler\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-scheduler pod=bootstrap-kube-scheduler-master-0_kube-system(a1a56802af72ce1aac6b5077f1695ac0)\"" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" Mar 12 12:27:47.026763 master-0 kubenswrapper[13984]: I0312 12:27:47.026731 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:27:56.529339 master-0 kubenswrapper[13984]: E0312 12:27:56.529243 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:00.127825 master-0 kubenswrapper[13984]: I0312 12:28:00.127699 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 12 12:28:00.133978 master-0 kubenswrapper[13984]: I0312 12:28:00.133937 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 12 12:28:00.134668 master-0 kubenswrapper[13984]: I0312 12:28:00.134643 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 12 12:28:00.135585 master-0 kubenswrapper[13984]: I0312 12:28:00.135543 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 12 12:28:00.137735 master-0 kubenswrapper[13984]: I0312 12:28:00.137686 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="b2f5073adcb260f7bafd0dbb4eae76a0d78ce200196488dbbf37a087b47f06a5" 
exitCode=137 Mar 12 12:28:00.137735 master-0 kubenswrapper[13984]: I0312 12:28:00.137732 13984 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="b0a55edaf166c34b98b5dcf71563b10a7151eb2e9e60290d0d641d4546feecf7" exitCode=137 Mar 12 12:28:00.442921 master-0 kubenswrapper[13984]: I0312 12:28:00.442853 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 12 12:28:00.444124 master-0 kubenswrapper[13984]: I0312 12:28:00.444088 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 12 12:28:00.444836 master-0 kubenswrapper[13984]: I0312 12:28:00.444797 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 12 12:28:00.445590 master-0 kubenswrapper[13984]: I0312 12:28:00.445551 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 12 12:28:00.447074 master-0 kubenswrapper[13984]: I0312 12:28:00.447024 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 12 12:28:00.526075 master-0 kubenswrapper[13984]: I0312 12:28:00.526028 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 12 12:28:00.526075 master-0 kubenswrapper[13984]: I0312 12:28:00.526072 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 12 12:28:00.526351 master-0 kubenswrapper[13984]: I0312 12:28:00.526123 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 12 12:28:00.526351 master-0 kubenswrapper[13984]: I0312 12:28:00.526205 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 12 12:28:00.526351 master-0 kubenswrapper[13984]: I0312 12:28:00.526227 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 12 12:28:00.526351 master-0 kubenswrapper[13984]: I0312 12:28:00.526329 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:28:00.526545 master-0 kubenswrapper[13984]: I0312 12:28:00.526328 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir" (OuterVolumeSpecName: "log-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:28:00.526545 master-0 kubenswrapper[13984]: I0312 12:28:00.526361 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:28:00.526545 master-0 kubenswrapper[13984]: I0312 12:28:00.526517 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir" (OuterVolumeSpecName: "data-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:28:00.526682 master-0 kubenswrapper[13984]: I0312 12:28:00.526557 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 12 12:28:00.526682 master-0 kubenswrapper[13984]: I0312 12:28:00.526600 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:28:00.526774 master-0 kubenswrapper[13984]: I0312 12:28:00.526690 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:28:00.526989 master-0 kubenswrapper[13984]: I0312 12:28:00.526951 13984 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:28:00.526989 master-0 kubenswrapper[13984]: I0312 12:28:00.526980 13984 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:28:00.527066 master-0 kubenswrapper[13984]: I0312 12:28:00.526993 13984 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 12 12:28:00.527066 master-0 kubenswrapper[13984]: I0312 12:28:00.527005 13984 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:28:00.527066 master-0 kubenswrapper[13984]: I0312 12:28:00.527017 13984 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:28:00.527066 master-0 kubenswrapper[13984]: I0312 12:28:00.527028 13984 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:28:01.149289 master-0 kubenswrapper[13984]: I0312 12:28:01.149233 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 12 12:28:01.150847 master-0 kubenswrapper[13984]: I0312 
12:28:01.150812 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 12 12:28:01.151471 master-0 kubenswrapper[13984]: I0312 12:28:01.151440 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 12 12:28:01.152597 master-0 kubenswrapper[13984]: I0312 12:28:01.152566 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 12 12:28:01.153839 master-0 kubenswrapper[13984]: I0312 12:28:01.153808 13984 scope.go:117] "RemoveContainer" containerID="75e3bbf6064ff7f35e82389d300ce882963ffc8541436a1a80d239d3b971f5e4" Mar 12 12:28:01.153971 master-0 kubenswrapper[13984]: I0312 12:28:01.153954 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 12 12:28:01.178209 master-0 kubenswrapper[13984]: I0312 12:28:01.178125 13984 scope.go:117] "RemoveContainer" containerID="49b7eed659093dd9559e33b3a0f638adb7dbba47c5e4ef561aa467ffc52b0eec" Mar 12 12:28:01.196140 master-0 kubenswrapper[13984]: I0312 12:28:01.196109 13984 scope.go:117] "RemoveContainer" containerID="f7059097bb2ddbff3e211fcd38dee654f074ce10a4773944219fb7905e3d5723" Mar 12 12:28:01.210100 master-0 kubenswrapper[13984]: I0312 12:28:01.210071 13984 scope.go:117] "RemoveContainer" containerID="b2f5073adcb260f7bafd0dbb4eae76a0d78ce200196488dbbf37a087b47f06a5" Mar 12 12:28:01.222872 master-0 kubenswrapper[13984]: I0312 12:28:01.222832 13984 scope.go:117] "RemoveContainer" containerID="b0a55edaf166c34b98b5dcf71563b10a7151eb2e9e60290d0d641d4546feecf7" Mar 12 12:28:01.233380 master-0 kubenswrapper[13984]: I0312 12:28:01.233339 13984 scope.go:117] "RemoveContainer" containerID="d7e05b92a4dbd9fdd6089f7478db7952000d8da47fb29fa9de9acabcf994c90c" Mar 12 12:28:01.245285 master-0 
kubenswrapper[13984]: I0312 12:28:01.245261 13984 scope.go:117] "RemoveContainer" containerID="17df0049e355b3a960768281cd9fb4fe90537eac08f31c82188b349d802deef8" Mar 12 12:28:01.255976 master-0 kubenswrapper[13984]: E0312 12:28:01.255915 13984 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice/crio-4e8520c5d1384053b96d4b2143e83dfab51a8dd7086c5302f88dafe1968cb0e8\": RecentStats: unable to find data in memory cache]" Mar 12 12:28:01.257313 master-0 kubenswrapper[13984]: I0312 12:28:01.257284 13984 scope.go:117] "RemoveContainer" containerID="c768108f96dad44b9e3bcf8d0d6db5eb9d2c1ac1b865d93bd4ff9f67e7bb635a" Mar 12 12:28:01.979835 master-0 kubenswrapper[13984]: I0312 12:28:01.979762 13984 scope.go:117] "RemoveContainer" containerID="a9d9b5f96bde28a030172fa8b8562f0ad2738118cc137cd6bc087cf4fbc7972f" Mar 12 12:28:01.990873 master-0 kubenswrapper[13984]: I0312 12:28:01.990815 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" path="/var/lib/kubelet/pods/8e52bef89f4b50e4590a1719bcc5d7e5/volumes" Mar 12 12:28:02.619172 master-0 kubenswrapper[13984]: I0312 12:28:02.619109 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:28:03.172006 master-0 kubenswrapper[13984]: I0312 12:28:03.171923 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"0d50ef84a902d60b2d7d410974b24e4f48e1a63818f16207d507864a9e96ea0a"} Mar 12 12:28:03.807688 master-0 kubenswrapper[13984]: E0312 
12:28:03.807458 13984 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c17bc108bfcc9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:27:29.776032969 +0000 UTC m=+181.974048501,LastTimestamp:2026-03-12 12:27:29.776032969 +0000 UTC m=+181.974048501,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:28:03.979881 master-0 kubenswrapper[13984]: I0312 12:28:03.979751 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 12 12:28:04.016914 master-0 kubenswrapper[13984]: I0312 12:28:04.016829 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:28:04.016914 master-0 kubenswrapper[13984]: I0312 12:28:04.016874 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:28:06.530233 master-0 kubenswrapper[13984]: E0312 12:28:06.530145 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:11.126988 master-0 kubenswrapper[13984]: E0312 12:28:11.126809 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:28:01Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:28:01Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:28:01Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:28:01Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:16.530536 master-0 kubenswrapper[13984]: E0312 12:28:16.530398 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:21.129326 master-0 kubenswrapper[13984]: E0312 12:28:21.128974 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:25.351339 master-0 kubenswrapper[13984]: I0312 12:28:25.351261 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-rzmhl_51d58450-50bb-4da0-b1f6-4135fbabd856/approver/1.log" Mar 12 12:28:25.352589 master-0 kubenswrapper[13984]: I0312 12:28:25.352468 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-rzmhl_51d58450-50bb-4da0-b1f6-4135fbabd856/approver/0.log" Mar 12 12:28:25.353870 master-0 kubenswrapper[13984]: I0312 12:28:25.353714 13984 generic.go:334] "Generic (PLEG): container finished" podID="51d58450-50bb-4da0-b1f6-4135fbabd856" containerID="9ebb624a01176d8a7e4322c3422f03b83091e3497ad89b9a86f0622ff33645b0" exitCode=1 Mar 12 12:28:25.353870 master-0 kubenswrapper[13984]: I0312 12:28:25.353786 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-rzmhl" event={"ID":"51d58450-50bb-4da0-b1f6-4135fbabd856","Type":"ContainerDied","Data":"9ebb624a01176d8a7e4322c3422f03b83091e3497ad89b9a86f0622ff33645b0"} Mar 12 12:28:25.354080 
master-0 kubenswrapper[13984]: I0312 12:28:25.353908 13984 scope.go:117] "RemoveContainer" containerID="2c2b5cd50e4b41a7c3aafd02e56e622ce6b2150721ba8e3603b831c988a04475" Mar 12 12:28:25.354729 master-0 kubenswrapper[13984]: I0312 12:28:25.354675 13984 scope.go:117] "RemoveContainer" containerID="9ebb624a01176d8a7e4322c3422f03b83091e3497ad89b9a86f0622ff33645b0" Mar 12 12:28:25.862013 master-0 kubenswrapper[13984]: E0312 12:28:25.861893 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[alertmanager-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" Mar 12 12:28:26.362909 master-0 kubenswrapper[13984]: I0312 12:28:26.362851 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-rzmhl_51d58450-50bb-4da0-b1f6-4135fbabd856/approver/1.log" Mar 12 12:28:26.363824 master-0 kubenswrapper[13984]: I0312 12:28:26.363401 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-rzmhl" event={"ID":"51d58450-50bb-4da0-b1f6-4135fbabd856","Type":"ContainerStarted","Data":"7010d2ea6c28bc6a64e0375365821265d3c274e1ac6b39a42695b188ebf8ffd3"} Mar 12 12:28:26.363824 master-0 kubenswrapper[13984]: I0312 12:28:26.363522 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:28:26.531443 master-0 kubenswrapper[13984]: E0312 12:28:26.531363 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:26.531443 master-0 kubenswrapper[13984]: I0312 12:28:26.531423 13984 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 12 12:28:28.107238 master-0 kubenswrapper[13984]: I0312 12:28:28.107183 13984 scope.go:117] "RemoveContainer" containerID="b0b0a71bb15ee38a2037cc0d67a425037c9a862e431396ce17c0501ae76f6aae" Mar 12 12:28:28.133360 master-0 kubenswrapper[13984]: I0312 12:28:28.133287 13984 scope.go:117] "RemoveContainer" containerID="b315bdbdd00e7e78faaebae53e6e5aca4dcfbe013781ad0113a093ac0097dc1b" Mar 12 12:28:31.052712 master-0 kubenswrapper[13984]: I0312 12:28:31.052599 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:28:31.054020 master-0 kubenswrapper[13984]: E0312 12:28:31.052834 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. No retries permitted until 2026-03-12 12:30:33.052801925 +0000 UTC m=+365.250817457 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:28:31.129880 master-0 kubenswrapper[13984]: E0312 12:28:31.129814 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:31.356884 master-0 kubenswrapper[13984]: E0312 12:28:31.356680 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[prometheus-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" Mar 12 12:28:31.400389 master-0 kubenswrapper[13984]: I0312 12:28:31.400284 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:28:36.531880 master-0 kubenswrapper[13984]: E0312 12:28:36.531731 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 12 12:28:36.544074 master-0 kubenswrapper[13984]: I0312 12:28:36.543985 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:28:36.544306 master-0 kubenswrapper[13984]: E0312 12:28:36.544175 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:30:38.544153556 +0000 UTC m=+370.742169068 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:28:37.812110 master-0 kubenswrapper[13984]: E0312 12:28:37.811865 13984 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c17bc108d2eb2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:27:29.776111282 +0000 UTC m=+181.974126774,LastTimestamp:2026-03-12 12:27:29.776111282 +0000 UTC m=+181.974126774,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:28:38.018801 master-0 kubenswrapper[13984]: E0312 12:28:38.018739 13984 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 12 12:28:38.019185 master-0 kubenswrapper[13984]: I0312 12:28:38.019156 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 12 12:28:38.038125 master-0 kubenswrapper[13984]: W0312 12:28:38.038061 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c709c82970b529e7b9b895aa92ef05.slice/crio-529fa42ef09482dc681a76f1e8ec57aeec938451fb7b6d98da55f49033c501a1 WatchSource:0}: Error finding container 529fa42ef09482dc681a76f1e8ec57aeec938451fb7b6d98da55f49033c501a1: Status 404 returned error can't find the container with id 529fa42ef09482dc681a76f1e8ec57aeec938451fb7b6d98da55f49033c501a1 Mar 12 12:28:38.458027 master-0 kubenswrapper[13984]: I0312 12:28:38.457934 13984 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="b06ac028534447265b6c5c9fb5e43424959c6fe45f2925c70fb1eb110cacc0db" exitCode=0 Mar 12 12:28:38.458027 master-0 kubenswrapper[13984]: I0312 12:28:38.457999 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"b06ac028534447265b6c5c9fb5e43424959c6fe45f2925c70fb1eb110cacc0db"} Mar 12 12:28:38.458027 master-0 kubenswrapper[13984]: I0312 12:28:38.458040 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"529fa42ef09482dc681a76f1e8ec57aeec938451fb7b6d98da55f49033c501a1"} Mar 12 12:28:38.458527 master-0 kubenswrapper[13984]: I0312 12:28:38.458429 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:28:38.458654 master-0 kubenswrapper[13984]: I0312 12:28:38.458469 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:28:41.130736 master-0 kubenswrapper[13984]: E0312 12:28:41.130638 13984 
kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:42.984457 master-0 kubenswrapper[13984]: I0312 12:28:42.984367 13984 status_manager.go:851] "Failed to get status for pod" podUID="161fce36d846c7ce98305d8ed6c23827" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods kube-controller-manager-master-0)" Mar 12 12:28:46.733932 master-0 kubenswrapper[13984]: E0312 12:28:46.733837 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="400ms" Mar 12 12:28:47.231662 master-0 kubenswrapper[13984]: I0312 12:28:47.231574 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:28:47.231662 master-0 kubenswrapper[13984]: I0312 12:28:47.231646 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:28:47.232095 master-0 kubenswrapper[13984]: E0312 12:28:47.231828 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object 
"openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:28:47.232095 master-0 kubenswrapper[13984]: E0312 12:28:47.231856 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:28:47.232095 master-0 kubenswrapper[13984]: E0312 12:28:47.231918 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:30:49.231897132 +0000 UTC m=+381.429912634 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:28:47.232095 master-0 kubenswrapper[13984]: E0312 12:28:47.232004 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:28:47.232095 master-0 kubenswrapper[13984]: E0312 12:28:47.232020 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:28:47.232095 master-0 kubenswrapper[13984]: E0312 12:28:47.232068 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:30:49.232053497 +0000 UTC m=+381.430069009 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:28:51.131653 master-0 kubenswrapper[13984]: E0312 12:28:51.131574 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:28:51.131653 master-0 kubenswrapper[13984]: E0312 12:28:51.131634 13984 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 12:28:57.135344 master-0 kubenswrapper[13984]: E0312 12:28:57.135241 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 12 12:29:02.631826 master-0 kubenswrapper[13984]: E0312 12:29:02.631735 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" podUID="c424f946-e2fe-4450-816b-b79640269ff5" Mar 12 12:29:02.650310 master-0 kubenswrapper[13984]: I0312 12:29:02.650252 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:29:05.677053 master-0 kubenswrapper[13984]: I0312 12:29:05.676963 13984 generic.go:334] "Generic (PLEG): container finished" podID="3c02552c-a477-4c6c-8a45-2fdc758c084b" containerID="26e67be1a2a9c4fa709a8c73f483e4f52cb5856a4a6c41d3c7dc95bd7856e253" exitCode=0 Mar 12 12:29:05.677053 master-0 kubenswrapper[13984]: I0312 12:29:05.677035 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" event={"ID":"3c02552c-a477-4c6c-8a45-2fdc758c084b","Type":"ContainerDied","Data":"26e67be1a2a9c4fa709a8c73f483e4f52cb5856a4a6c41d3c7dc95bd7856e253"} Mar 12 12:29:05.677744 master-0 kubenswrapper[13984]: I0312 12:29:05.677723 13984 scope.go:117] "RemoveContainer" containerID="26e67be1a2a9c4fa709a8c73f483e4f52cb5856a4a6c41d3c7dc95bd7856e253" Mar 12 12:29:06.252113 master-0 kubenswrapper[13984]: I0312 12:29:06.252041 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:29:06.252352 master-0 kubenswrapper[13984]: E0312 12:29:06.252227 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:31:08.25221083 +0000 UTC m=+400.450226322 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:29:06.688647 master-0 kubenswrapper[13984]: I0312 12:29:06.687744 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" event={"ID":"3c02552c-a477-4c6c-8a45-2fdc758c084b","Type":"ContainerStarted","Data":"5e01e31c815c5b711ffadabe13f58efe54f451d9ca7bac1da1a137d640b4680b"} Mar 12 12:29:06.688647 master-0 kubenswrapper[13984]: I0312 12:29:06.688093 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:29:06.690391 master-0 kubenswrapper[13984]: I0312 12:29:06.690349 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-rgstx" Mar 12 12:29:07.936536 master-0 kubenswrapper[13984]: E0312 12:29:07.936339 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 12 12:29:11.815615 master-0 kubenswrapper[13984]: E0312 12:29:11.815392 13984 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c17bc108deffb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:27:29.776160763 +0000 UTC m=+181.974176265,LastTimestamp:2026-03-12 12:27:29.776160763 +0000 UTC m=+181.974176265,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:29:12.462221 master-0 kubenswrapper[13984]: E0312 12:29:12.462160 13984 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 12 12:29:12.737432 master-0 kubenswrapper[13984]: I0312 12:29:12.737377 13984 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="2d8029143f193f89fd33eb3260ab8a8779faf231bdece48fc138fe3b41f76e11" exitCode=0 Mar 12 12:29:12.737687 master-0 kubenswrapper[13984]: I0312 12:29:12.737439 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"2d8029143f193f89fd33eb3260ab8a8779faf231bdece48fc138fe3b41f76e11"} Mar 12 12:29:12.737808 master-0 kubenswrapper[13984]: I0312 12:29:12.737790 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:29:12.737842 master-0 kubenswrapper[13984]: I0312 12:29:12.737809 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:29:19.538164 master-0 kubenswrapper[13984]: E0312 12:29:19.538059 13984 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 12 12:29:21.814203 master-0 kubenswrapper[13984]: I0312 12:29:21.814103 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/1.log" Mar 12 12:29:21.815527 master-0 kubenswrapper[13984]: I0312 12:29:21.815428 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/0.log" Mar 12 12:29:21.816440 master-0 kubenswrapper[13984]: I0312 12:29:21.816372 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/cluster-cloud-controller-manager/0.log" Mar 12 12:29:21.816656 master-0 kubenswrapper[13984]: I0312 12:29:21.816456 13984 generic.go:334] "Generic (PLEG): container finished" podID="81cb0504-9455-4398-aed1-5cc6790f292e" containerID="088c014670bf988535318a4419ded1bd38c960b3f737ad977d4457bc99c40c21" exitCode=1 Mar 12 12:29:21.816656 master-0 kubenswrapper[13984]: I0312 12:29:21.816546 13984 generic.go:334] "Generic (PLEG): container finished" podID="81cb0504-9455-4398-aed1-5cc6790f292e" containerID="ec76313e8f3101ce66ed46fb96c7f37aca18f8240494bdcfc27692fc83f7724a" exitCode=1 Mar 12 12:29:21.816656 master-0 kubenswrapper[13984]: I0312 12:29:21.816596 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" 
event={"ID":"81cb0504-9455-4398-aed1-5cc6790f292e","Type":"ContainerDied","Data":"088c014670bf988535318a4419ded1bd38c960b3f737ad977d4457bc99c40c21"} Mar 12 12:29:21.816656 master-0 kubenswrapper[13984]: I0312 12:29:21.816640 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" event={"ID":"81cb0504-9455-4398-aed1-5cc6790f292e","Type":"ContainerDied","Data":"ec76313e8f3101ce66ed46fb96c7f37aca18f8240494bdcfc27692fc83f7724a"} Mar 12 12:29:21.817030 master-0 kubenswrapper[13984]: I0312 12:29:21.816670 13984 scope.go:117] "RemoveContainer" containerID="db3a8ef068f83e5dd24c99272c185086b8f1f58c8b82fffeba41265fdc76efe5" Mar 12 12:29:21.817426 master-0 kubenswrapper[13984]: I0312 12:29:21.817358 13984 scope.go:117] "RemoveContainer" containerID="ec76313e8f3101ce66ed46fb96c7f37aca18f8240494bdcfc27692fc83f7724a" Mar 12 12:29:21.817426 master-0 kubenswrapper[13984]: I0312 12:29:21.817408 13984 scope.go:117] "RemoveContainer" containerID="088c014670bf988535318a4419ded1bd38c960b3f737ad977d4457bc99c40c21" Mar 12 12:29:22.828358 master-0 kubenswrapper[13984]: I0312 12:29:22.828275 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/1.log" Mar 12 12:29:22.829458 master-0 kubenswrapper[13984]: I0312 12:29:22.829394 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/cluster-cloud-controller-manager/0.log" Mar 12 12:29:22.829634 master-0 kubenswrapper[13984]: I0312 12:29:22.829522 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" 
event={"ID":"81cb0504-9455-4398-aed1-5cc6790f292e","Type":"ContainerStarted","Data":"179deacfaafd2641e17a86303d01cc8ddc0e5aef2d9ed79cb879741a9137427c"} Mar 12 12:29:22.829634 master-0 kubenswrapper[13984]: I0312 12:29:22.829568 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-pvjft" event={"ID":"81cb0504-9455-4398-aed1-5cc6790f292e","Type":"ContainerStarted","Data":"e953c4eba2980ba0fcadc5e664fda4e8dea666e9a609b86063cfa2fbe79dc775"} Mar 12 12:29:24.846591 master-0 kubenswrapper[13984]: I0312 12:29:24.846545 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2849p_aa8ddfdd-7f2d-4fd4-b666-1497dee752df/manager/0.log" Mar 12 12:29:24.846591 master-0 kubenswrapper[13984]: I0312 12:29:24.846586 13984 generic.go:334] "Generic (PLEG): container finished" podID="aa8ddfdd-7f2d-4fd4-b666-1497dee752df" containerID="0618c9f5aa46fc08303406a2cb903067ccdfab22a4fdf61ae7d07562528f34f0" exitCode=1 Mar 12 12:29:24.847162 master-0 kubenswrapper[13984]: I0312 12:29:24.846629 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" event={"ID":"aa8ddfdd-7f2d-4fd4-b666-1497dee752df","Type":"ContainerDied","Data":"0618c9f5aa46fc08303406a2cb903067ccdfab22a4fdf61ae7d07562528f34f0"} Mar 12 12:29:24.847162 master-0 kubenswrapper[13984]: I0312 12:29:24.847026 13984 scope.go:117] "RemoveContainer" containerID="0618c9f5aa46fc08303406a2cb903067ccdfab22a4fdf61ae7d07562528f34f0" Mar 12 12:29:24.849358 master-0 kubenswrapper[13984]: I0312 12:29:24.849326 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/0.log" Mar 12 12:29:24.849435 master-0 kubenswrapper[13984]: I0312 
12:29:24.849403 13984 generic.go:334] "Generic (PLEG): container finished" podID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" containerID="7085fb6154550cf5e6b3b06a657089d39dcfd14f6f8134eed107ad674b6e5b62" exitCode=1 Mar 12 12:29:24.849560 master-0 kubenswrapper[13984]: I0312 12:29:24.849532 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerDied","Data":"7085fb6154550cf5e6b3b06a657089d39dcfd14f6f8134eed107ad674b6e5b62"} Mar 12 12:29:24.850217 master-0 kubenswrapper[13984]: I0312 12:29:24.850194 13984 scope.go:117] "RemoveContainer" containerID="7085fb6154550cf5e6b3b06a657089d39dcfd14f6f8134eed107ad674b6e5b62" Mar 12 12:29:24.863648 master-0 kubenswrapper[13984]: I0312 12:29:24.852451 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-pqph7_022dd526-0ea5-4224-9d2e-778ed4ef8a56/manager/0.log" Mar 12 12:29:24.864270 master-0 kubenswrapper[13984]: I0312 12:29:24.864235 13984 generic.go:334] "Generic (PLEG): container finished" podID="022dd526-0ea5-4224-9d2e-778ed4ef8a56" containerID="d5a1ce8cbbe911064b01e5c73e3e8c17228cb220219ccba9621023e45528295b" exitCode=1 Mar 12 12:29:24.864363 master-0 kubenswrapper[13984]: I0312 12:29:24.864341 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" event={"ID":"022dd526-0ea5-4224-9d2e-778ed4ef8a56","Type":"ContainerDied","Data":"d5a1ce8cbbe911064b01e5c73e3e8c17228cb220219ccba9621023e45528295b"} Mar 12 12:29:24.864745 master-0 kubenswrapper[13984]: I0312 12:29:24.864681 13984 scope.go:117] "RemoveContainer" containerID="d5a1ce8cbbe911064b01e5c73e3e8c17228cb220219ccba9621023e45528295b" Mar 12 12:29:25.873373 master-0 kubenswrapper[13984]: I0312 12:29:25.873282 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-2849p_aa8ddfdd-7f2d-4fd4-b666-1497dee752df/manager/0.log" Mar 12 12:29:25.874163 master-0 kubenswrapper[13984]: I0312 12:29:25.873384 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" event={"ID":"aa8ddfdd-7f2d-4fd4-b666-1497dee752df","Type":"ContainerStarted","Data":"18e354ed1ea4a642dfc3acbfe58a2e0ccf42924b53ba3f303e1eac8c1ca9a338"} Mar 12 12:29:25.874163 master-0 kubenswrapper[13984]: I0312 12:29:25.873769 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:29:25.875896 master-0 kubenswrapper[13984]: I0312 12:29:25.875858 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/0.log" Mar 12 12:29:25.876011 master-0 kubenswrapper[13984]: I0312 12:29:25.875962 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerStarted","Data":"8ee31e6fba1847a9978774591e03c71aa76ace6195ea06993968ed3542391f09"} Mar 12 12:29:25.878763 master-0 kubenswrapper[13984]: I0312 12:29:25.878727 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-pqph7_022dd526-0ea5-4224-9d2e-778ed4ef8a56/manager/0.log" Mar 12 12:29:25.879249 master-0 kubenswrapper[13984]: I0312 12:29:25.879209 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" 
event={"ID":"022dd526-0ea5-4224-9d2e-778ed4ef8a56","Type":"ContainerStarted","Data":"7d6076989fd7d343c7898e5d535c239cb1c7685171ad4ff2dacb470358f1183a"} Mar 12 12:29:25.879473 master-0 kubenswrapper[13984]: I0312 12:29:25.879444 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:29:27.973981 master-0 kubenswrapper[13984]: I0312 12:29:27.973905 13984 kubelet.go:1505] "Image garbage collection succeeded" Mar 12 12:29:30.808517 master-0 kubenswrapper[13984]: I0312 12:29:30.808395 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-2849p" Mar 12 12:29:32.740645 master-0 kubenswrapper[13984]: E0312 12:29:32.740408 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 12 12:29:36.702949 master-0 kubenswrapper[13984]: I0312 12:29:36.702869 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-pqph7" Mar 12 12:29:42.994066 master-0 kubenswrapper[13984]: I0312 12:29:42.993927 13984 status_manager.go:851] "Failed to get status for pod" podUID="29c709c82970b529e7b9b895aa92ef05" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 12 12:29:45.819262 master-0 kubenswrapper[13984]: E0312 12:29:45.819077 13984 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c17bc108ec9ea openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:27:29.776216554 +0000 UTC m=+181.974232076,LastTimestamp:2026-03-12 12:27:29.776216554 +0000 UTC m=+181.974232076,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:29:46.740807 master-0 kubenswrapper[13984]: E0312 12:29:46.740718 13984 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 12 12:29:48.087205 master-0 kubenswrapper[13984]: I0312 12:29:48.087116 13984 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="dea8140f4ab5b640c5655e5eb4d626e1aebff01f85897d18d941b1d5a236b70d" exitCode=0 Mar 12 12:29:48.087205 master-0 kubenswrapper[13984]: I0312 12:29:48.087162 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"dea8140f4ab5b640c5655e5eb4d626e1aebff01f85897d18d941b1d5a236b70d"} Mar 12 12:29:48.087927 master-0 kubenswrapper[13984]: I0312 12:29:48.087459 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:29:48.087927 master-0 kubenswrapper[13984]: I0312 12:29:48.087473 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:29:49.142072 master-0 kubenswrapper[13984]: E0312 12:29:49.141349 13984 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:29:51.698611 master-0 kubenswrapper[13984]: E0312 12:29:51.698460 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:29:41Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:29:41Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:29:41Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-12T12:29:41Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": context deadline exceeded" Mar 12 12:29:55.144724 master-0 kubenswrapper[13984]: I0312 12:29:55.144669 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/1.log" Mar 12 12:29:55.145348 master-0 kubenswrapper[13984]: I0312 12:29:55.145308 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/0.log" Mar 12 12:29:55.145419 master-0 kubenswrapper[13984]: I0312 12:29:55.145384 13984 generic.go:334] "Generic (PLEG): container finished" podID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" containerID="8ee31e6fba1847a9978774591e03c71aa76ace6195ea06993968ed3542391f09" exitCode=1 Mar 12 12:29:55.145506 master-0 kubenswrapper[13984]: I0312 12:29:55.145444 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerDied","Data":"8ee31e6fba1847a9978774591e03c71aa76ace6195ea06993968ed3542391f09"} Mar 12 12:29:55.146319 master-0 kubenswrapper[13984]: I0312 12:29:55.145547 13984 scope.go:117] "RemoveContainer" containerID="7085fb6154550cf5e6b3b06a657089d39dcfd14f6f8134eed107ad674b6e5b62" Mar 12 12:29:55.146319 master-0 kubenswrapper[13984]: I0312 12:29:55.146063 13984 scope.go:117] "RemoveContainer" containerID="8ee31e6fba1847a9978774591e03c71aa76ace6195ea06993968ed3542391f09" Mar 12 12:29:55.146319 master-0 kubenswrapper[13984]: E0312 12:29:55.146277 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kf7kw_openshift-cluster-storage-operator(7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" podUID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" Mar 12 12:29:56.160146 master-0 kubenswrapper[13984]: I0312 12:29:56.159369 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:29:56.160146 
master-0 kubenswrapper[13984]: I0312 12:29:56.159437 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="2e1c10ff585ac62d9bd97610178ea53b8be2cf0963171b0a4671fc1da4d09ac1" exitCode=0 Mar 12 12:29:56.160146 master-0 kubenswrapper[13984]: I0312 12:29:56.159522 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerDied","Data":"2e1c10ff585ac62d9bd97610178ea53b8be2cf0963171b0a4671fc1da4d09ac1"} Mar 12 12:29:56.161309 master-0 kubenswrapper[13984]: I0312 12:29:56.161135 13984 scope.go:117] "RemoveContainer" containerID="2e1c10ff585ac62d9bd97610178ea53b8be2cf0963171b0a4671fc1da4d09ac1" Mar 12 12:29:56.164755 master-0 kubenswrapper[13984]: I0312 12:29:56.164737 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/1.log" Mar 12 12:29:57.178992 master-0 kubenswrapper[13984]: I0312 12:29:57.178934 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:29:57.179792 master-0 kubenswrapper[13984]: I0312 12:29:57.179005 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"c55b65128843d977f3bf59904769f6c371b0e047d3aa5afbb7499ba10645c231"} Mar 12 12:29:58.185172 master-0 kubenswrapper[13984]: I0312 12:29:58.185060 13984 generic.go:334] "Generic (PLEG): container finished" podID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerID="cddd485ac51118295f95288e1b47d7560d221f73530c7f79d262c83d31f5faa3" exitCode=0 Mar 12 12:29:58.186091 master-0 
kubenswrapper[13984]: I0312 12:29:58.185160 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" event={"ID":"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0","Type":"ContainerDied","Data":"cddd485ac51118295f95288e1b47d7560d221f73530c7f79d262c83d31f5faa3"} Mar 12 12:29:58.186870 master-0 kubenswrapper[13984]: I0312 12:29:58.186845 13984 scope.go:117] "RemoveContainer" containerID="cddd485ac51118295f95288e1b47d7560d221f73530c7f79d262c83d31f5faa3" Mar 12 12:29:58.188387 master-0 kubenswrapper[13984]: I0312 12:29:58.188357 13984 generic.go:334] "Generic (PLEG): container finished" podID="2f3a291a-d9af-4e0f-a307-8928e4dc523d" containerID="bf8bddde515df885e66ccd95c414dbc03cb4e50f31ec3dd72464e9bd5b698d49" exitCode=0 Mar 12 12:29:58.188534 master-0 kubenswrapper[13984]: I0312 12:29:58.188443 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" event={"ID":"2f3a291a-d9af-4e0f-a307-8928e4dc523d","Type":"ContainerDied","Data":"bf8bddde515df885e66ccd95c414dbc03cb4e50f31ec3dd72464e9bd5b698d49"} Mar 12 12:29:58.189282 master-0 kubenswrapper[13984]: I0312 12:29:58.189258 13984 scope.go:117] "RemoveContainer" containerID="bf8bddde515df885e66ccd95c414dbc03cb4e50f31ec3dd72464e9bd5b698d49" Mar 12 12:29:59.198137 master-0 kubenswrapper[13984]: I0312 12:29:59.198092 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" event={"ID":"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0","Type":"ContainerStarted","Data":"fccd1d387d9f834932469bff2fb734f1ceea36c669a79dc7f029e4319a59c287"} Mar 12 12:29:59.199320 master-0 kubenswrapper[13984]: I0312 12:29:59.199285 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:29:59.202695 master-0 kubenswrapper[13984]: I0312 12:29:59.202632 13984 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mlfvv" event={"ID":"2f3a291a-d9af-4e0f-a307-8928e4dc523d","Type":"ContainerStarted","Data":"684dd245a7310137407785a760d2f7bff15258a2860f55907749c3198227e8e1"} Mar 12 12:29:59.205553 master-0 kubenswrapper[13984]: I0312 12:29:59.205518 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" Mar 12 12:29:59.206196 master-0 kubenswrapper[13984]: I0312 12:29:59.206150 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4_c62edaec-38e2-4b73-8bb5-c776abfb310f/control-plane-machine-set-operator/0.log" Mar 12 12:29:59.206268 master-0 kubenswrapper[13984]: I0312 12:29:59.206228 13984 generic.go:334] "Generic (PLEG): container finished" podID="c62edaec-38e2-4b73-8bb5-c776abfb310f" containerID="3b494a9f1b81fd19a42a7bdbd404289733966fcd678f30d029c5042c6b33a7f8" exitCode=1 Mar 12 12:29:59.206316 master-0 kubenswrapper[13984]: I0312 12:29:59.206267 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" event={"ID":"c62edaec-38e2-4b73-8bb5-c776abfb310f","Type":"ContainerDied","Data":"3b494a9f1b81fd19a42a7bdbd404289733966fcd678f30d029c5042c6b33a7f8"} Mar 12 12:29:59.206848 master-0 kubenswrapper[13984]: I0312 12:29:59.206811 13984 scope.go:117] "RemoveContainer" containerID="3b494a9f1b81fd19a42a7bdbd404289733966fcd678f30d029c5042c6b33a7f8" Mar 12 12:30:00.090572 master-0 kubenswrapper[13984]: I0312 12:30:00.090474 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:30:00.224598 master-0 kubenswrapper[13984]: I0312 12:30:00.224547 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4_c62edaec-38e2-4b73-8bb5-c776abfb310f/control-plane-machine-set-operator/0.log" Mar 12 12:30:00.225052 master-0 kubenswrapper[13984]: I0312 12:30:00.224643 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dnxx4" event={"ID":"c62edaec-38e2-4b73-8bb5-c776abfb310f","Type":"ContainerStarted","Data":"013ed4181fda56bbccfa6d870ef124deda8926c8afc18571026c54dc79f79abc"} Mar 12 12:30:00.434765 master-0 kubenswrapper[13984]: I0312 12:30:00.434636 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:30:01.233145 master-0 kubenswrapper[13984]: I0312 12:30:01.233088 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-nq8zw_2bab9dba-235f-467c-9224-634cca9acbd2/machine-api-operator/0.log" Mar 12 12:30:01.233784 master-0 kubenswrapper[13984]: I0312 12:30:01.233594 13984 generic.go:334] "Generic (PLEG): container finished" podID="2bab9dba-235f-467c-9224-634cca9acbd2" containerID="2ed14edf7421ab677fa9c2cbcfd66d8fefd23747e7948839f3fcbffbd630fad1" exitCode=255 Mar 12 12:30:01.233784 master-0 kubenswrapper[13984]: I0312 12:30:01.233653 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" event={"ID":"2bab9dba-235f-467c-9224-634cca9acbd2","Type":"ContainerDied","Data":"2ed14edf7421ab677fa9c2cbcfd66d8fefd23747e7948839f3fcbffbd630fad1"} Mar 12 12:30:01.234493 master-0 kubenswrapper[13984]: I0312 12:30:01.234439 13984 scope.go:117] "RemoveContainer" containerID="2ed14edf7421ab677fa9c2cbcfd66d8fefd23747e7948839f3fcbffbd630fad1" Mar 12 12:30:01.699603 master-0 kubenswrapper[13984]: E0312 12:30:01.699469 13984 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:02.243720 master-0 kubenswrapper[13984]: I0312 12:30:02.243666 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-nq8zw_2bab9dba-235f-467c-9224-634cca9acbd2/machine-api-operator/0.log" Mar 12 12:30:02.244177 master-0 kubenswrapper[13984]: I0312 12:30:02.244075 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-nq8zw" event={"ID":"2bab9dba-235f-467c-9224-634cca9acbd2","Type":"ContainerStarted","Data":"56b69466460f08817f9b86d4ab052249b9b23f75c1955972d524394831d3b7d8"} Mar 12 12:30:03.435769 master-0 kubenswrapper[13984]: I0312 12:30:03.435715 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:30:03.436428 master-0 kubenswrapper[13984]: I0312 12:30:03.436397 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:04.262685 master-0 kubenswrapper[13984]: I0312 12:30:04.262650 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-2hk7d_6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/machine-approver-controller/0.log" Mar 12 
12:30:04.263646 master-0 kubenswrapper[13984]: I0312 12:30:04.263609 13984 generic.go:334] "Generic (PLEG): container finished" podID="6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc" containerID="17336bfb565956ddc39a230d0296c19707c7961d1004d9cb08cfd084b37df4ac" exitCode=255 Mar 12 12:30:04.263824 master-0 kubenswrapper[13984]: I0312 12:30:04.263678 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" event={"ID":"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc","Type":"ContainerDied","Data":"17336bfb565956ddc39a230d0296c19707c7961d1004d9cb08cfd084b37df4ac"} Mar 12 12:30:04.264625 master-0 kubenswrapper[13984]: I0312 12:30:04.264602 13984 scope.go:117] "RemoveContainer" containerID="17336bfb565956ddc39a230d0296c19707c7961d1004d9cb08cfd084b37df4ac" Mar 12 12:30:04.266311 master-0 kubenswrapper[13984]: I0312 12:30:04.266274 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-tcc85_d1d16bbc-778b-4fc1-abb2-b43e79a7c532/package-server-manager/0.log" Mar 12 12:30:04.266743 master-0 kubenswrapper[13984]: I0312 12:30:04.266708 13984 generic.go:334] "Generic (PLEG): container finished" podID="d1d16bbc-778b-4fc1-abb2-b43e79a7c532" containerID="392bacabbad99777b25b231313d0a57eab1ab274431a1195e628095a52c28f22" exitCode=1 Mar 12 12:30:04.266861 master-0 kubenswrapper[13984]: I0312 12:30:04.266749 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" event={"ID":"d1d16bbc-778b-4fc1-abb2-b43e79a7c532","Type":"ContainerDied","Data":"392bacabbad99777b25b231313d0a57eab1ab274431a1195e628095a52c28f22"} Mar 12 12:30:04.267320 master-0 kubenswrapper[13984]: I0312 12:30:04.267280 13984 scope.go:117] "RemoveContainer" containerID="392bacabbad99777b25b231313d0a57eab1ab274431a1195e628095a52c28f22" Mar 12 12:30:05.275198 master-0 kubenswrapper[13984]: I0312 
12:30:05.275131 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-2hk7d_6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc/machine-approver-controller/0.log" Mar 12 12:30:05.275856 master-0 kubenswrapper[13984]: I0312 12:30:05.275786 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-2hk7d" event={"ID":"6243e45c-6e83-4fe0-b619-f7bf9e5d4dbc","Type":"ContainerStarted","Data":"40e51cd9e873f4002b7f66f0703f4ad98388266a368cf8fa889eec8523272b97"} Mar 12 12:30:05.277945 master-0 kubenswrapper[13984]: I0312 12:30:05.277909 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-tcc85_d1d16bbc-778b-4fc1-abb2-b43e79a7c532/package-server-manager/0.log" Mar 12 12:30:05.278342 master-0 kubenswrapper[13984]: I0312 12:30:05.278289 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" event={"ID":"d1d16bbc-778b-4fc1-abb2-b43e79a7c532","Type":"ContainerStarted","Data":"1d6c190e3b5e227cfd82c22b95f2dea31dce7c582eccf7a6e1f0d61033769992"} Mar 12 12:30:05.278701 master-0 kubenswrapper[13984]: I0312 12:30:05.278656 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:30:06.143830 master-0 kubenswrapper[13984]: E0312 12:30:06.143444 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:30:10.980081 master-0 kubenswrapper[13984]: I0312 12:30:10.979974 13984 scope.go:117] "RemoveContainer" 
containerID="8ee31e6fba1847a9978774591e03c71aa76ace6195ea06993968ed3542391f09" Mar 12 12:30:11.336767 master-0 kubenswrapper[13984]: I0312 12:30:11.336601 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/1.log" Mar 12 12:30:11.336767 master-0 kubenswrapper[13984]: I0312 12:30:11.336695 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerStarted","Data":"00b4d1989b1e7b331ff7b4f14f7676a7481f98eada54c4b775d0df18ddb8863c"} Mar 12 12:30:11.700564 master-0 kubenswrapper[13984]: E0312 12:30:11.700444 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:13.434719 master-0 kubenswrapper[13984]: I0312 12:30:13.434608 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:30:13.435312 master-0 kubenswrapper[13984]: I0312 12:30:13.434745 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:16.384111 
master-0 kubenswrapper[13984]: I0312 12:30:16.384039 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/3.log" Mar 12 12:30:16.385760 master-0 kubenswrapper[13984]: I0312 12:30:16.385717 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/2.log" Mar 12 12:30:16.386655 master-0 kubenswrapper[13984]: I0312 12:30:16.386590 13984 generic.go:334] "Generic (PLEG): container finished" podID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" containerID="45a3f6cede6c52f3c09a792270927a52a9ce9fba7d97a6e7da8f8579cfb8d0ad" exitCode=1 Mar 12 12:30:16.386814 master-0 kubenswrapper[13984]: I0312 12:30:16.386656 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerDied","Data":"45a3f6cede6c52f3c09a792270927a52a9ce9fba7d97a6e7da8f8579cfb8d0ad"} Mar 12 12:30:16.386814 master-0 kubenswrapper[13984]: I0312 12:30:16.386738 13984 scope.go:117] "RemoveContainer" containerID="c7a8cac38562711df5097b3495b3fcbf36ebc31ee34e5cfed0a5c587cefecb36" Mar 12 12:30:16.387392 master-0 kubenswrapper[13984]: I0312 12:30:16.387350 13984 scope.go:117] "RemoveContainer" containerID="45a3f6cede6c52f3c09a792270927a52a9ce9fba7d97a6e7da8f8579cfb8d0ad" Mar 12 12:30:16.387924 master-0 kubenswrapper[13984]: E0312 12:30:16.387848 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" Mar 12 12:30:17.401840 master-0 kubenswrapper[13984]: I0312 12:30:17.401742 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/3.log" Mar 12 12:30:19.823585 master-0 kubenswrapper[13984]: E0312 12:30:19.823363 13984 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{prometheus-k8s-0.189c17add172716e openshift-monitoring 12382 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:prometheus-k8s-0,UID:d30590a1-9d92-4347-84ae-fffc821e6a57,APIVersion:v1,ResourceVersion:12155,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume \"prometheus-trusted-ca-bundle\" : configmap references non-existent config key: ca-bundle.crt,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:26:28 +0000 UTC,LastTimestamp:2026-03-12 12:27:32.478287417 +0000 UTC m=+184.676302909,Count:8,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 12 12:30:21.700869 master-0 kubenswrapper[13984]: E0312 12:30:21.700708 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded" Mar 12 12:30:22.089797 master-0 kubenswrapper[13984]: E0312 12:30:22.089624 13984 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 12 
12:30:22.454324 master-0 kubenswrapper[13984]: I0312 12:30:22.454280 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"0e074efc91d3148be657d3fc2972d0c5ec628b26fddecf9acc7ae4f5db91f953"} Mar 12 12:30:23.144373 master-0 kubenswrapper[13984]: E0312 12:30:23.144259 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s" Mar 12 12:30:23.434082 master-0 kubenswrapper[13984]: I0312 12:30:23.433942 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:30:23.434294 master-0 kubenswrapper[13984]: I0312 12:30:23.434078 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:23.434294 master-0 kubenswrapper[13984]: I0312 12:30:23.434150 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:30:23.435343 master-0 kubenswrapper[13984]: I0312 12:30:23.435238 13984 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"c55b65128843d977f3bf59904769f6c371b0e047d3aa5afbb7499ba10645c231"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 12:30:23.435422 master-0 kubenswrapper[13984]: I0312 12:30:23.435404 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" containerID="cri-o://c55b65128843d977f3bf59904769f6c371b0e047d3aa5afbb7499ba10645c231" gracePeriod=30 Mar 12 12:30:23.468089 master-0 kubenswrapper[13984]: I0312 12:30:23.468011 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"297489994d59cdccf817f8b480ddae27148958fe87ba308a63937fd261486699"} Mar 12 12:30:23.468089 master-0 kubenswrapper[13984]: I0312 12:30:23.468068 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"51e79a319275f7b3c43f3624523caba1f98c516108312902336c2704cc74f945"} Mar 12 12:30:23.468089 master-0 kubenswrapper[13984]: I0312 12:30:23.468095 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"6b48e0f5845693d718710436efc1d8174e23eef6117d9cec8bb9260b8474fee4"} Mar 12 12:30:24.485603 master-0 kubenswrapper[13984]: I0312 12:30:24.485472 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"161581d578964105cff4cada2bd7a0a8dbcaa9fbbebd3581d5afa24992598d3b"} Mar 12 12:30:24.486526 master-0 kubenswrapper[13984]: 
I0312 12:30:24.485891 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:30:24.486526 master-0 kubenswrapper[13984]: I0312 12:30:24.485935 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:30:24.490722 master-0 kubenswrapper[13984]: I0312 12:30:24.490641 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/1.log" Mar 12 12:30:24.495965 master-0 kubenswrapper[13984]: I0312 12:30:24.495915 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:30:24.496094 master-0 kubenswrapper[13984]: I0312 12:30:24.496023 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="c55b65128843d977f3bf59904769f6c371b0e047d3aa5afbb7499ba10645c231" exitCode=255 Mar 12 12:30:24.496182 master-0 kubenswrapper[13984]: I0312 12:30:24.496099 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerDied","Data":"c55b65128843d977f3bf59904769f6c371b0e047d3aa5afbb7499ba10645c231"} Mar 12 12:30:24.496182 master-0 kubenswrapper[13984]: I0312 12:30:24.496138 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"2f2f9d1efef11c9442b86fd7d08966fa1428bb49103fdc16da0e4045620a41cf"} Mar 12 12:30:24.496312 master-0 kubenswrapper[13984]: I0312 12:30:24.496198 13984 scope.go:117] "RemoveContainer" 
containerID="2e1c10ff585ac62d9bd97610178ea53b8be2cf0963171b0a4671fc1da4d09ac1" Mar 12 12:30:25.509617 master-0 kubenswrapper[13984]: I0312 12:30:25.509560 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/1.log" Mar 12 12:30:25.512681 master-0 kubenswrapper[13984]: I0312 12:30:25.512632 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:30:27.980015 master-0 kubenswrapper[13984]: I0312 12:30:27.979934 13984 scope.go:117] "RemoveContainer" containerID="45a3f6cede6c52f3c09a792270927a52a9ce9fba7d97a6e7da8f8579cfb8d0ad" Mar 12 12:30:27.980801 master-0 kubenswrapper[13984]: E0312 12:30:27.980160 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" Mar 12 12:30:28.019683 master-0 kubenswrapper[13984]: I0312 12:30:28.019587 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 12 12:30:28.019683 master-0 kubenswrapper[13984]: I0312 12:30:28.019662 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 12 12:30:29.365420 master-0 kubenswrapper[13984]: E0312 12:30:29.365299 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[alertmanager-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" 
pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" Mar 12 12:30:29.569326 master-0 kubenswrapper[13984]: I0312 12:30:29.569228 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:30:30.090415 master-0 kubenswrapper[13984]: I0312 12:30:30.090304 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:30:30.434375 master-0 kubenswrapper[13984]: I0312 12:30:30.434223 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:30:31.701674 master-0 kubenswrapper[13984]: E0312 12:30:31.701604 13984 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:31.701674 master-0 kubenswrapper[13984]: E0312 12:30:31.701647 13984 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 12 12:30:33.058000 master-0 kubenswrapper[13984]: I0312 12:30:33.057868 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:30:33.058986 master-0 kubenswrapper[13984]: E0312 12:30:33.058118 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle podName:41e53fa8-31cb-44a9-9411-8ce2df26b156 nodeName:}" failed. 
No retries permitted until 2026-03-12 12:32:35.058089636 +0000 UTC m=+487.256105168 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:30:33.435279 master-0 kubenswrapper[13984]: I0312 12:30:33.435067 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:30:33.436201 master-0 kubenswrapper[13984]: I0312 12:30:33.436146 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:34.402459 master-0 kubenswrapper[13984]: E0312 12:30:34.402379 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[prometheus-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" Mar 12 12:30:34.605164 master-0 kubenswrapper[13984]: I0312 12:30:34.605089 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:30:38.048159 master-0 kubenswrapper[13984]: I0312 12:30:38.048082 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 12 12:30:38.547311 master-0 kubenswrapper[13984]: I0312 12:30:38.547204 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:30:38.547714 master-0 kubenswrapper[13984]: E0312 12:30:38.547514 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle podName:d30590a1-9d92-4347-84ae-fffc821e6a57 nodeName:}" failed. No retries permitted until 2026-03-12 12:32:40.547460875 +0000 UTC m=+492.745476377 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:30:40.146588 master-0 kubenswrapper[13984]: E0312 12:30:40.146521 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:30:41.334045 master-0 kubenswrapper[13984]: I0312 12:30:41.333994 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-tcc85" Mar 12 12:30:41.660820 master-0 kubenswrapper[13984]: I0312 12:30:41.660678 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/2.log" Mar 12 12:30:41.661414 master-0 kubenswrapper[13984]: I0312 12:30:41.661360 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/1.log" Mar 12 12:30:41.661562 master-0 kubenswrapper[13984]: I0312 12:30:41.661425 13984 generic.go:334] "Generic (PLEG): container finished" podID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" containerID="00b4d1989b1e7b331ff7b4f14f7676a7481f98eada54c4b775d0df18ddb8863c" exitCode=1 Mar 12 12:30:41.661562 master-0 kubenswrapper[13984]: I0312 12:30:41.661475 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" 
event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerDied","Data":"00b4d1989b1e7b331ff7b4f14f7676a7481f98eada54c4b775d0df18ddb8863c"} Mar 12 12:30:41.661703 master-0 kubenswrapper[13984]: I0312 12:30:41.661569 13984 scope.go:117] "RemoveContainer" containerID="8ee31e6fba1847a9978774591e03c71aa76ace6195ea06993968ed3542391f09" Mar 12 12:30:41.665209 master-0 kubenswrapper[13984]: I0312 12:30:41.665147 13984 scope.go:117] "RemoveContainer" containerID="00b4d1989b1e7b331ff7b4f14f7676a7481f98eada54c4b775d0df18ddb8863c" Mar 12 12:30:41.665703 master-0 kubenswrapper[13984]: E0312 12:30:41.665652 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kf7kw_openshift-cluster-storage-operator(7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" podUID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" Mar 12 12:30:42.678830 master-0 kubenswrapper[13984]: I0312 12:30:42.678740 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/2.log" Mar 12 12:30:42.980429 master-0 kubenswrapper[13984]: I0312 12:30:42.980351 13984 scope.go:117] "RemoveContainer" containerID="45a3f6cede6c52f3c09a792270927a52a9ce9fba7d97a6e7da8f8579cfb8d0ad" Mar 12 12:30:42.995205 master-0 kubenswrapper[13984]: I0312 12:30:42.995137 13984 status_manager.go:851] "Failed to get status for pod" podUID="51ef1ec5-3e17-485a-9797-566ef207fa0a" pod="openshift-etcd/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Mar 12 12:30:43.042584 master-0 kubenswrapper[13984]: I0312 12:30:43.042535 13984 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 12 12:30:43.435150 master-0 kubenswrapper[13984]: I0312 12:30:43.434983 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:30:43.435150 master-0 kubenswrapper[13984]: I0312 12:30:43.435092 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:43.687551 master-0 kubenswrapper[13984]: I0312 12:30:43.687415 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/3.log" Mar 12 12:30:43.688113 master-0 kubenswrapper[13984]: I0312 12:30:43.687903 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerStarted","Data":"a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621"} Mar 12 12:30:49.345258 master-0 kubenswrapper[13984]: I0312 12:30:49.345064 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " 
pod="openshift-kube-controller-manager/installer-2-master-0" Mar 12 12:30:49.346568 master-0 kubenswrapper[13984]: I0312 12:30:49.345715 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:30:49.346568 master-0 kubenswrapper[13984]: E0312 12:30:49.345394 13984 projected.go:288] Couldn't get configMap openshift-kube-controller-manager/kube-root-ca.crt: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:30:49.346568 master-0 kubenswrapper[13984]: E0312 12:30:49.345800 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-controller-manager/installer-2-master-0: object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:30:49.346568 master-0 kubenswrapper[13984]: E0312 12:30:49.345867 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access podName:a97fcd56-aa52-414a-b370-154c1b34c1ed nodeName:}" failed. No retries permitted until 2026-03-12 12:32:51.345842932 +0000 UTC m=+503.543858444 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access") pod "installer-2-master-0" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed") : object "openshift-kube-controller-manager"/"kube-root-ca.crt" not registered Mar 12 12:30:49.346568 master-0 kubenswrapper[13984]: E0312 12:30:49.345977 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:30:49.346568 master-0 kubenswrapper[13984]: E0312 12:30:49.346013 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:30:49.346568 master-0 kubenswrapper[13984]: E0312 12:30:49.346091 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:32:51.346072719 +0000 UTC m=+503.544088241 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 12 12:30:53.434748 master-0 kubenswrapper[13984]: I0312 12:30:53.434639 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:30:53.435566 master-0 kubenswrapper[13984]: I0312 12:30:53.434759 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:30:53.435566 master-0 kubenswrapper[13984]: I0312 12:30:53.434831 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:30:53.436017 master-0 kubenswrapper[13984]: I0312 12:30:53.435969 13984 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"2f2f9d1efef11c9442b86fd7d08966fa1428bb49103fdc16da0e4045620a41cf"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 12:30:53.436177 master-0 kubenswrapper[13984]: I0312 12:30:53.436116 13984 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" containerID="cri-o://2f2f9d1efef11c9442b86fd7d08966fa1428bb49103fdc16da0e4045620a41cf" gracePeriod=30 Mar 12 12:30:53.787143 master-0 kubenswrapper[13984]: I0312 12:30:53.787098 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/2.log" Mar 12 12:30:53.787583 master-0 kubenswrapper[13984]: I0312 12:30:53.787560 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/1.log" Mar 12 12:30:53.789195 master-0 kubenswrapper[13984]: I0312 12:30:53.788986 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:30:53.789195 master-0 kubenswrapper[13984]: I0312 12:30:53.789021 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="2f2f9d1efef11c9442b86fd7d08966fa1428bb49103fdc16da0e4045620a41cf" exitCode=255 Mar 12 12:30:53.789195 master-0 kubenswrapper[13984]: I0312 12:30:53.789047 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerDied","Data":"2f2f9d1efef11c9442b86fd7d08966fa1428bb49103fdc16da0e4045620a41cf"} Mar 12 12:30:53.789195 master-0 kubenswrapper[13984]: I0312 12:30:53.789077 13984 scope.go:117] "RemoveContainer" containerID="c55b65128843d977f3bf59904769f6c371b0e047d3aa5afbb7499ba10645c231" Mar 12 12:30:54.798721 master-0 kubenswrapper[13984]: I0312 12:30:54.798659 
13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/2.log" Mar 12 12:30:54.800263 master-0 kubenswrapper[13984]: I0312 12:30:54.800209 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:30:54.800470 master-0 kubenswrapper[13984]: I0312 12:30:54.800272 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4"} Mar 12 12:30:56.979813 master-0 kubenswrapper[13984]: I0312 12:30:56.979761 13984 scope.go:117] "RemoveContainer" containerID="00b4d1989b1e7b331ff7b4f14f7676a7481f98eada54c4b775d0df18ddb8863c" Mar 12 12:30:56.980528 master-0 kubenswrapper[13984]: E0312 12:30:56.980142 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kf7kw_openshift-cluster-storage-operator(7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" podUID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" Mar 12 12:30:57.148929 master-0 kubenswrapper[13984]: E0312 12:30:57.148514 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:30:58.489693 master-0 kubenswrapper[13984]: E0312 12:30:58.489590 13984 
mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 12 12:30:58.829409 master-0 kubenswrapper[13984]: I0312 12:30:58.829250 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:30:58.829409 master-0 kubenswrapper[13984]: I0312 12:30:58.829300 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:31:00.091094 master-0 kubenswrapper[13984]: I0312 12:31:00.090962 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:31:00.434395 master-0 kubenswrapper[13984]: I0312 12:31:00.434204 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:31:03.434892 master-0 kubenswrapper[13984]: I0312 12:31:03.434760 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:31:03.435908 master-0 kubenswrapper[13984]: I0312 12:31:03.434899 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:31:05.652143 master-0 kubenswrapper[13984]: E0312 
12:31:05.652041 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" podUID="c424f946-e2fe-4450-816b-b79640269ff5" Mar 12 12:31:05.888366 master-0 kubenswrapper[13984]: I0312 12:31:05.888289 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:31:08.254439 master-0 kubenswrapper[13984]: I0312 12:31:08.254363 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:31:08.255058 master-0 kubenswrapper[13984]: E0312 12:31:08.254602 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca podName:c424f946-e2fe-4450-816b-b79640269ff5 nodeName:}" failed. No retries permitted until 2026-03-12 12:33:10.254583217 +0000 UTC m=+522.452598709 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca") pod "console-operator-6c7fb6b958-rnnjn" (UID: "c424f946-e2fe-4450-816b-b79640269ff5") : configmap references non-existent config key: ca-bundle.crt Mar 12 12:31:08.980167 master-0 kubenswrapper[13984]: I0312 12:31:08.980087 13984 scope.go:117] "RemoveContainer" containerID="00b4d1989b1e7b331ff7b4f14f7676a7481f98eada54c4b775d0df18ddb8863c" Mar 12 12:31:09.916764 master-0 kubenswrapper[13984]: I0312 12:31:09.916685 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/2.log" Mar 12 12:31:09.916764 master-0 kubenswrapper[13984]: I0312 12:31:09.916767 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerStarted","Data":"cba7d533b5bc6aa262c2d422e177581645a93f32a2a9e5df1bae21e883522eb0"} Mar 12 12:31:13.435431 master-0 kubenswrapper[13984]: I0312 12:31:13.435364 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:31:13.436348 master-0 kubenswrapper[13984]: I0312 12:31:13.436303 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout 
exceeded while awaiting headers)" Mar 12 12:31:14.150119 master-0 kubenswrapper[13984]: E0312 12:31:14.150002 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:31:23.435386 master-0 kubenswrapper[13984]: I0312 12:31:23.435294 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 12 12:31:23.435386 master-0 kubenswrapper[13984]: I0312 12:31:23.435354 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 12 12:31:23.435386 master-0 kubenswrapper[13984]: I0312 12:31:23.435401 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:31:23.436546 master-0 kubenswrapper[13984]: I0312 12:31:23.436084 13984 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 12 
12:31:23.436546 master-0 kubenswrapper[13984]: I0312 12:31:23.436165 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" containerID="cri-o://ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" gracePeriod=30 Mar 12 12:31:23.560241 master-0 kubenswrapper[13984]: E0312 12:31:23.560190 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(161fce36d846c7ce98305d8ed6c23827)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" Mar 12 12:31:24.022768 master-0 kubenswrapper[13984]: I0312 12:31:24.022674 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/3.log" Mar 12 12:31:24.023397 master-0 kubenswrapper[13984]: I0312 12:31:24.023332 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/2.log" Mar 12 12:31:24.025433 master-0 kubenswrapper[13984]: I0312 12:31:24.025393 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:31:24.025574 master-0 kubenswrapper[13984]: I0312 12:31:24.025451 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" 
exitCode=255 Mar 12 12:31:24.025574 master-0 kubenswrapper[13984]: I0312 12:31:24.025535 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerDied","Data":"ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4"} Mar 12 12:31:24.025696 master-0 kubenswrapper[13984]: I0312 12:31:24.025633 13984 scope.go:117] "RemoveContainer" containerID="2f2f9d1efef11c9442b86fd7d08966fa1428bb49103fdc16da0e4045620a41cf" Mar 12 12:31:24.026282 master-0 kubenswrapper[13984]: I0312 12:31:24.026229 13984 scope.go:117] "RemoveContainer" containerID="ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" Mar 12 12:31:24.026639 master-0 kubenswrapper[13984]: E0312 12:31:24.026604 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(161fce36d846c7ce98305d8ed6c23827)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" Mar 12 12:31:25.035667 master-0 kubenswrapper[13984]: I0312 12:31:25.035612 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/3.log" Mar 12 12:31:25.037465 master-0 kubenswrapper[13984]: I0312 12:31:25.037416 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:31:31.151463 master-0 kubenswrapper[13984]: E0312 12:31:31.151365 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:31:31.754546 master-0 kubenswrapper[13984]: I0312 12:31:31.754448 13984 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:31:31.755284 master-0 kubenswrapper[13984]: I0312 12:31:31.755246 13984 scope.go:117] "RemoveContainer" containerID="ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" Mar 12 12:31:31.755643 master-0 kubenswrapper[13984]: E0312 12:31:31.755608 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(161fce36d846c7ce98305d8ed6c23827)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" Mar 12 12:31:32.833031 master-0 kubenswrapper[13984]: E0312 12:31:32.832943 13984 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 12 12:31:40.154554 master-0 kubenswrapper[13984]: I0312 12:31:40.154523 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/3.log" Mar 12 12:31:40.155411 master-0 kubenswrapper[13984]: I0312 12:31:40.155383 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/2.log" Mar 12 12:31:40.155490 
master-0 kubenswrapper[13984]: I0312 12:31:40.155433 13984 generic.go:334] "Generic (PLEG): container finished" podID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" containerID="cba7d533b5bc6aa262c2d422e177581645a93f32a2a9e5df1bae21e883522eb0" exitCode=1 Mar 12 12:31:40.155532 master-0 kubenswrapper[13984]: I0312 12:31:40.155467 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerDied","Data":"cba7d533b5bc6aa262c2d422e177581645a93f32a2a9e5df1bae21e883522eb0"} Mar 12 12:31:40.155532 master-0 kubenswrapper[13984]: I0312 12:31:40.155520 13984 scope.go:117] "RemoveContainer" containerID="00b4d1989b1e7b331ff7b4f14f7676a7481f98eada54c4b775d0df18ddb8863c" Mar 12 12:31:40.156549 master-0 kubenswrapper[13984]: I0312 12:31:40.156323 13984 scope.go:117] "RemoveContainer" containerID="cba7d533b5bc6aa262c2d422e177581645a93f32a2a9e5df1bae21e883522eb0" Mar 12 12:31:40.157105 master-0 kubenswrapper[13984]: E0312 12:31:40.156740 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kf7kw_openshift-cluster-storage-operator(7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" podUID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" Mar 12 12:31:41.164693 master-0 kubenswrapper[13984]: I0312 12:31:41.164573 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/3.log" Mar 12 12:31:42.996769 master-0 kubenswrapper[13984]: I0312 12:31:42.996718 13984 status_manager.go:851] "Failed to get status for pod" podUID="a1a56802af72ce1aac6b5077f1695ac0" 
pod="kube-system/bootstrap-kube-scheduler-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods bootstrap-kube-scheduler-master-0)" Mar 12 12:31:44.193574 master-0 kubenswrapper[13984]: I0312 12:31:44.193446 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/4.log" Mar 12 12:31:44.195266 master-0 kubenswrapper[13984]: I0312 12:31:44.195184 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/3.log" Mar 12 12:31:44.196066 master-0 kubenswrapper[13984]: I0312 12:31:44.195990 13984 generic.go:334] "Generic (PLEG): container finished" podID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" containerID="a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621" exitCode=1 Mar 12 12:31:44.196157 master-0 kubenswrapper[13984]: I0312 12:31:44.196084 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerDied","Data":"a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621"} Mar 12 12:31:44.196157 master-0 kubenswrapper[13984]: I0312 12:31:44.196148 13984 scope.go:117] "RemoveContainer" containerID="45a3f6cede6c52f3c09a792270927a52a9ce9fba7d97a6e7da8f8579cfb8d0ad" Mar 12 12:31:44.197515 master-0 kubenswrapper[13984]: I0312 12:31:44.197433 13984 scope.go:117] "RemoveContainer" containerID="a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621" Mar 12 12:31:44.198118 master-0 kubenswrapper[13984]: E0312 12:31:44.198048 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: 
\"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" Mar 12 12:31:44.980556 master-0 kubenswrapper[13984]: I0312 12:31:44.980462 13984 scope.go:117] "RemoveContainer" containerID="ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" Mar 12 12:31:44.980877 master-0 kubenswrapper[13984]: E0312 12:31:44.980739 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(161fce36d846c7ce98305d8ed6c23827)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" Mar 12 12:31:45.206826 master-0 kubenswrapper[13984]: I0312 12:31:45.206712 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/4.log" Mar 12 12:31:48.153517 master-0 kubenswrapper[13984]: E0312 12:31:48.153389 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:31:54.981648 master-0 kubenswrapper[13984]: I0312 12:31:54.981522 13984 scope.go:117] "RemoveContainer" containerID="cba7d533b5bc6aa262c2d422e177581645a93f32a2a9e5df1bae21e883522eb0" Mar 12 12:31:54.982568 master-0 kubenswrapper[13984]: E0312 12:31:54.981864 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kf7kw_openshift-cluster-storage-operator(7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" podUID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" Mar 12 12:31:54.982784 master-0 kubenswrapper[13984]: I0312 12:31:54.982734 13984 scope.go:117] "RemoveContainer" containerID="a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621" Mar 12 12:31:54.983122 master-0 kubenswrapper[13984]: E0312 12:31:54.983084 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" Mar 12 12:31:55.982381 master-0 kubenswrapper[13984]: I0312 12:31:55.982306 13984 scope.go:117] "RemoveContainer" containerID="ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" Mar 12 12:31:55.983046 master-0 kubenswrapper[13984]: E0312 12:31:55.982569 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(161fce36d846c7ce98305d8ed6c23827)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" Mar 12 12:32:05.154814 master-0 kubenswrapper[13984]: E0312 12:32:05.154589 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 12 12:32:06.004669 master-0 kubenswrapper[13984]: I0312 12:32:06.004621 13984 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="531a537f715644fc0623e243f05c5d1fb97e1cf2fd31e77e02a07259ef5d606f" exitCode=0 Mar 12 12:32:06.004669 master-0 kubenswrapper[13984]: I0312 12:32:06.004671 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"531a537f715644fc0623e243f05c5d1fb97e1cf2fd31e77e02a07259ef5d606f"} Mar 12 12:32:06.005327 master-0 kubenswrapper[13984]: I0312 12:32:06.005296 13984 scope.go:117] "RemoveContainer" containerID="531a537f715644fc0623e243f05c5d1fb97e1cf2fd31e77e02a07259ef5d606f" Mar 12 12:32:06.980356 master-0 kubenswrapper[13984]: I0312 12:32:06.980232 13984 scope.go:117] "RemoveContainer" containerID="ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" Mar 12 12:32:07.020064 master-0 kubenswrapper[13984]: I0312 12:32:07.020010 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"545ce6883090e4ecdcea7622975d3e1147cda32524b819153cd3b1df99e6ad16"} Mar 12 12:32:08.034938 master-0 kubenswrapper[13984]: I0312 12:32:08.034856 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/3.log" Mar 12 12:32:08.037854 master-0 kubenswrapper[13984]: I0312 12:32:08.037820 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:32:08.037951 master-0 kubenswrapper[13984]: I0312 12:32:08.037905 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"161fce36d846c7ce98305d8ed6c23827","Type":"ContainerStarted","Data":"37ce22b830e08d23df66196c50da46b1d74b7f16a0a3129542989b5d4bdc3dd3"} Mar 12 12:32:08.166813 master-0 kubenswrapper[13984]: I0312 12:32:08.165796 13984 patch_prober.go:28] interesting pod/route-controller-manager-7665b44c8d-2lgnf container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.47:8443/healthz\": dial tcp 10.128.0.47:8443: connect: connection refused" start-of-body= Mar 12 12:32:08.166813 master-0 kubenswrapper[13984]: I0312 12:32:08.165860 13984 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.47:8443/healthz\": dial tcp 10.128.0.47:8443: connect: connection refused" Mar 12 12:32:08.980158 master-0 kubenswrapper[13984]: I0312 12:32:08.980073 13984 scope.go:117] "RemoveContainer" containerID="a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621" Mar 12 12:32:08.980470 master-0 kubenswrapper[13984]: E0312 12:32:08.980305 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" 
podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" Mar 12 12:32:09.047815 master-0 kubenswrapper[13984]: I0312 12:32:09.047751 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-9vtjp_b9194868-75ce-4138-a9d4-ddd64660c529/cluster-node-tuning-operator/0.log" Mar 12 12:32:09.048288 master-0 kubenswrapper[13984]: I0312 12:32:09.047840 13984 generic.go:334] "Generic (PLEG): container finished" podID="b9194868-75ce-4138-a9d4-ddd64660c529" containerID="6ff152a8cc30b3ca7b40abe31bb7a67d4765a151cbd91ac76f3274a4601e4bc4" exitCode=1 Mar 12 12:32:09.048288 master-0 kubenswrapper[13984]: I0312 12:32:09.047955 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" event={"ID":"b9194868-75ce-4138-a9d4-ddd64660c529","Type":"ContainerDied","Data":"6ff152a8cc30b3ca7b40abe31bb7a67d4765a151cbd91ac76f3274a4601e4bc4"} Mar 12 12:32:09.048912 master-0 kubenswrapper[13984]: I0312 12:32:09.048808 13984 scope.go:117] "RemoveContainer" containerID="6ff152a8cc30b3ca7b40abe31bb7a67d4765a151cbd91ac76f3274a4601e4bc4" Mar 12 12:32:09.051209 master-0 kubenswrapper[13984]: I0312 12:32:09.051162 13984 generic.go:334] "Generic (PLEG): container finished" podID="f3b704e7-1291-4645-8a0d-2a937829d7ac" containerID="43680f7bbccbe0e894d252b9c7302ee1699f682897d030495b133d3d14f28d18" exitCode=0 Mar 12 12:32:09.051378 master-0 kubenswrapper[13984]: I0312 12:32:09.051283 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" event={"ID":"f3b704e7-1291-4645-8a0d-2a937829d7ac","Type":"ContainerDied","Data":"43680f7bbccbe0e894d252b9c7302ee1699f682897d030495b133d3d14f28d18"} Mar 12 12:32:09.051980 master-0 kubenswrapper[13984]: I0312 12:32:09.051935 13984 scope.go:117] "RemoveContainer" 
containerID="43680f7bbccbe0e894d252b9c7302ee1699f682897d030495b133d3d14f28d18" Mar 12 12:32:09.054822 master-0 kubenswrapper[13984]: I0312 12:32:09.054759 13984 generic.go:334] "Generic (PLEG): container finished" podID="cfd178d7-f518-413b-95ab-ab6687be6e0f" containerID="c725e2adf84dc5ad7765bcbefc340bfdc4d1be5e28386cdc2ec851ee8c8574d8" exitCode=0 Mar 12 12:32:09.054997 master-0 kubenswrapper[13984]: I0312 12:32:09.054810 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" event={"ID":"cfd178d7-f518-413b-95ab-ab6687be6e0f","Type":"ContainerDied","Data":"c725e2adf84dc5ad7765bcbefc340bfdc4d1be5e28386cdc2ec851ee8c8574d8"} Mar 12 12:32:09.056161 master-0 kubenswrapper[13984]: I0312 12:32:09.056134 13984 scope.go:117] "RemoveContainer" containerID="c725e2adf84dc5ad7765bcbefc340bfdc4d1be5e28386cdc2ec851ee8c8574d8" Mar 12 12:32:09.058662 master-0 kubenswrapper[13984]: I0312 12:32:09.058602 13984 generic.go:334] "Generic (PLEG): container finished" podID="99a11fe6-48a1-439e-b788-158dbe267dcd" containerID="1bc386ffa63a2ec11752a9c7ca7b3a646adea9e87a3d6e20220e3af8fe3c25ee" exitCode=0 Mar 12 12:32:09.058662 master-0 kubenswrapper[13984]: I0312 12:32:09.058647 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" event={"ID":"99a11fe6-48a1-439e-b788-158dbe267dcd","Type":"ContainerDied","Data":"1bc386ffa63a2ec11752a9c7ca7b3a646adea9e87a3d6e20220e3af8fe3c25ee"} Mar 12 12:32:09.059224 master-0 kubenswrapper[13984]: I0312 12:32:09.059180 13984 scope.go:117] "RemoveContainer" containerID="1bc386ffa63a2ec11752a9c7ca7b3a646adea9e87a3d6e20220e3af8fe3c25ee" Mar 12 12:32:09.062876 master-0 kubenswrapper[13984]: I0312 12:32:09.062792 13984 generic.go:334] "Generic (PLEG): container finished" podID="a154f648-b96d-449e-b0f5-ba32266000c2" containerID="ba12b4f3c2b615d1e6edfe9917ea5510fedb4efbe7773f21a5c975a8c101f165" exitCode=0 
Mar 12 12:32:09.063021 master-0 kubenswrapper[13984]: I0312 12:32:09.062925 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" event={"ID":"a154f648-b96d-449e-b0f5-ba32266000c2","Type":"ContainerDied","Data":"ba12b4f3c2b615d1e6edfe9917ea5510fedb4efbe7773f21a5c975a8c101f165"} Mar 12 12:32:09.063021 master-0 kubenswrapper[13984]: I0312 12:32:09.062974 13984 scope.go:117] "RemoveContainer" containerID="962d17c59c83032444283250e4b6771dc0674c33ed005bbb62fede3e00c87666" Mar 12 12:32:09.063708 master-0 kubenswrapper[13984]: I0312 12:32:09.063660 13984 scope.go:117] "RemoveContainer" containerID="ba12b4f3c2b615d1e6edfe9917ea5510fedb4efbe7773f21a5c975a8c101f165" Mar 12 12:32:09.067294 master-0 kubenswrapper[13984]: I0312 12:32:09.067192 13984 generic.go:334] "Generic (PLEG): container finished" podID="0aeeef2a-f9df-4f87-b985-bd1da94c76c3" containerID="13d6e3d4ec61f1e849f9a3f93e128e87d996c76cefb71ba46af5b1c143ef3967" exitCode=0 Mar 12 12:32:09.067659 master-0 kubenswrapper[13984]: I0312 12:32:09.067318 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" event={"ID":"0aeeef2a-f9df-4f87-b985-bd1da94c76c3","Type":"ContainerDied","Data":"13d6e3d4ec61f1e849f9a3f93e128e87d996c76cefb71ba46af5b1c143ef3967"} Mar 12 12:32:09.068147 master-0 kubenswrapper[13984]: I0312 12:32:09.068098 13984 scope.go:117] "RemoveContainer" containerID="13d6e3d4ec61f1e849f9a3f93e128e87d996c76cefb71ba46af5b1c143ef3967" Mar 12 12:32:09.071000 master-0 kubenswrapper[13984]: I0312 12:32:09.070914 13984 generic.go:334] "Generic (PLEG): container finished" podID="15bf86d9-62b3-4af8-b6f6-23131d712332" containerID="dccd5d7d429a30ef17453995a9a86b8815c101400dca3ce09b7103fbd7a3f58c" exitCode=0 Mar 12 12:32:09.071218 master-0 kubenswrapper[13984]: I0312 12:32:09.071014 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" event={"ID":"15bf86d9-62b3-4af8-b6f6-23131d712332","Type":"ContainerDied","Data":"dccd5d7d429a30ef17453995a9a86b8815c101400dca3ce09b7103fbd7a3f58c"} Mar 12 12:32:09.071917 master-0 kubenswrapper[13984]: I0312 12:32:09.071860 13984 scope.go:117] "RemoveContainer" containerID="dccd5d7d429a30ef17453995a9a86b8815c101400dca3ce09b7103fbd7a3f58c" Mar 12 12:32:09.074879 master-0 kubenswrapper[13984]: I0312 12:32:09.074585 13984 generic.go:334] "Generic (PLEG): container finished" podID="5a012d0b-d1a8-4cd3-8b91-b346d0445f24" containerID="34dc0209ff1d7a04ba94a94ac3d8ef4ae5cf2a5dd4532cb0a823e435a74047ac" exitCode=0 Mar 12 12:32:09.074879 master-0 kubenswrapper[13984]: I0312 12:32:09.074688 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" event={"ID":"5a012d0b-d1a8-4cd3-8b91-b346d0445f24","Type":"ContainerDied","Data":"34dc0209ff1d7a04ba94a94ac3d8ef4ae5cf2a5dd4532cb0a823e435a74047ac"} Mar 12 12:32:09.076784 master-0 kubenswrapper[13984]: I0312 12:32:09.075065 13984 scope.go:117] "RemoveContainer" containerID="34dc0209ff1d7a04ba94a94ac3d8ef4ae5cf2a5dd4532cb0a823e435a74047ac" Mar 12 12:32:09.080633 master-0 kubenswrapper[13984]: I0312 12:32:09.077748 13984 generic.go:334] "Generic (PLEG): container finished" podID="021b22e3-b4c5-426d-b761-181f1e54175d" containerID="69a3615121435266c91b06fdc9e703ea81cecb073e9b8530439411a8ada925fe" exitCode=0 Mar 12 12:32:09.080633 master-0 kubenswrapper[13984]: I0312 12:32:09.078346 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" event={"ID":"021b22e3-b4c5-426d-b761-181f1e54175d","Type":"ContainerDied","Data":"69a3615121435266c91b06fdc9e703ea81cecb073e9b8530439411a8ada925fe"} Mar 12 12:32:09.080633 master-0 kubenswrapper[13984]: I0312 12:32:09.079308 13984 scope.go:117] "RemoveContainer" 
containerID="69a3615121435266c91b06fdc9e703ea81cecb073e9b8530439411a8ada925fe" Mar 12 12:32:09.084759 master-0 kubenswrapper[13984]: I0312 12:32:09.084059 13984 generic.go:334] "Generic (PLEG): container finished" podID="9ec202b2-e051-466b-a3fb-c054db0a9b16" containerID="1df1f496be3b4e1e8b0655b191e9ae344f113db45974d4edcfe571cf0c573c1d" exitCode=0 Mar 12 12:32:09.084759 master-0 kubenswrapper[13984]: I0312 12:32:09.084130 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" event={"ID":"9ec202b2-e051-466b-a3fb-c054db0a9b16","Type":"ContainerDied","Data":"1df1f496be3b4e1e8b0655b191e9ae344f113db45974d4edcfe571cf0c573c1d"} Mar 12 12:32:09.086474 master-0 kubenswrapper[13984]: I0312 12:32:09.085280 13984 scope.go:117] "RemoveContainer" containerID="1df1f496be3b4e1e8b0655b191e9ae344f113db45974d4edcfe571cf0c573c1d" Mar 12 12:32:09.088355 master-0 kubenswrapper[13984]: I0312 12:32:09.088247 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/3.log" Mar 12 12:32:09.089350 master-0 kubenswrapper[13984]: I0312 12:32:09.089299 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/2.log" Mar 12 12:32:09.090452 master-0 kubenswrapper[13984]: I0312 12:32:09.090393 13984 generic.go:334] "Generic (PLEG): container finished" podID="632651f7-6641-49d8-9c48-7f6ea5846538" containerID="f0fb0da09e8c36b2b08b8d5e0695af6a8eeabc526c30a2a41b1610b3f1603e7d" exitCode=255 Mar 12 12:32:09.090543 master-0 kubenswrapper[13984]: I0312 12:32:09.090522 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" 
event={"ID":"632651f7-6641-49d8-9c48-7f6ea5846538","Type":"ContainerDied","Data":"f0fb0da09e8c36b2b08b8d5e0695af6a8eeabc526c30a2a41b1610b3f1603e7d"} Mar 12 12:32:09.091145 master-0 kubenswrapper[13984]: I0312 12:32:09.091103 13984 scope.go:117] "RemoveContainer" containerID="f0fb0da09e8c36b2b08b8d5e0695af6a8eeabc526c30a2a41b1610b3f1603e7d" Mar 12 12:32:09.091445 master-0 kubenswrapper[13984]: E0312 12:32:09.091396 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-69576476f7-ph7gk_openshift-machine-api(632651f7-6641-49d8-9c48-7f6ea5846538)\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" podUID="632651f7-6641-49d8-9c48-7f6ea5846538" Mar 12 12:32:09.108525 master-0 kubenswrapper[13984]: I0312 12:32:09.108360 13984 generic.go:334] "Generic (PLEG): container finished" podID="ab087440-bdf2-4e2f-9a5a-434d50a2329a" containerID="9561b49f3e6e4bd4b2bd8099eb3e3f1cd67c706451785c2fd2a2d282caaf82ac" exitCode=0 Mar 12 12:32:09.108525 master-0 kubenswrapper[13984]: I0312 12:32:09.108464 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" event={"ID":"ab087440-bdf2-4e2f-9a5a-434d50a2329a","Type":"ContainerDied","Data":"9561b49f3e6e4bd4b2bd8099eb3e3f1cd67c706451785c2fd2a2d282caaf82ac"} Mar 12 12:32:09.109597 master-0 kubenswrapper[13984]: I0312 12:32:09.109548 13984 scope.go:117] "RemoveContainer" containerID="9561b49f3e6e4bd4b2bd8099eb3e3f1cd67c706451785c2fd2a2d282caaf82ac" Mar 12 12:32:09.183871 master-0 kubenswrapper[13984]: I0312 12:32:09.183798 13984 scope.go:117] "RemoveContainer" containerID="d357ccc688b993b9454b28bfd7fb28a5d58ecf020cbf9839477bf958a0d7b96f" Mar 12 12:32:09.288590 master-0 kubenswrapper[13984]: I0312 12:32:09.288543 13984 scope.go:117] "RemoveContainer" 
containerID="da81346b3001ee955f12da9dabc7c3b2591cdb7d6466508682550d487c8ccccc" Mar 12 12:32:09.344968 master-0 kubenswrapper[13984]: I0312 12:32:09.344927 13984 scope.go:117] "RemoveContainer" containerID="c10ad00e6d5ca94dd8aee1068bcae9a35fd5744bbc9fa9703850b00e0063db31" Mar 12 12:32:09.659521 master-0 kubenswrapper[13984]: I0312 12:32:09.657942 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:32:09.659521 master-0 kubenswrapper[13984]: I0312 12:32:09.658003 13984 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:32:09.981502 master-0 kubenswrapper[13984]: I0312 12:32:09.980952 13984 scope.go:117] "RemoveContainer" containerID="cba7d533b5bc6aa262c2d422e177581645a93f32a2a9e5df1bae21e883522eb0" Mar 12 12:32:09.981502 master-0 kubenswrapper[13984]: E0312 12:32:09.981153 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kf7kw_openshift-cluster-storage-operator(7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" podUID="7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef" Mar 12 12:32:10.090342 master-0 kubenswrapper[13984]: I0312 12:32:10.090297 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:32:10.116056 master-0 kubenswrapper[13984]: I0312 12:32:10.116021 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/3.log" Mar 12 12:32:10.118394 master-0 kubenswrapper[13984]: 
I0312 12:32:10.118364 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-7nb6b" event={"ID":"ab087440-bdf2-4e2f-9a5a-434d50a2329a","Type":"ContainerStarted","Data":"57c41a766e7e6ebe04ce8d1605da47b12cd7a6cbc6f3888c76880a02e4ad4025"} Mar 12 12:32:10.122885 master-0 kubenswrapper[13984]: I0312 12:32:10.122846 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" event={"ID":"a154f648-b96d-449e-b0f5-ba32266000c2","Type":"ContainerStarted","Data":"e0954f1a37443173f27cbe500220e8341d66c7d9375fefee39a7fe552542b847"} Mar 12 12:32:10.123030 master-0 kubenswrapper[13984]: I0312 12:32:10.123009 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:32:10.124731 master-0 kubenswrapper[13984]: I0312 12:32:10.124704 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-gxx99" event={"ID":"15bf86d9-62b3-4af8-b6f6-23131d712332","Type":"ContainerStarted","Data":"081dfd020f6cc869cf892b07c34d0a7e2d7819a228f57dc9b95e6e8ec12e842a"} Mar 12 12:32:10.127233 master-0 kubenswrapper[13984]: I0312 12:32:10.127185 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" event={"ID":"021b22e3-b4c5-426d-b761-181f1e54175d","Type":"ContainerStarted","Data":"7cc6bae69d9bd535eb5f09007bd50d092688d8fbbf463b94e2879d9bbc8321ea"} Mar 12 12:32:10.127503 master-0 kubenswrapper[13984]: I0312 12:32:10.127469 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:32:10.129937 master-0 kubenswrapper[13984]: I0312 12:32:10.129912 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-9vtjp_b9194868-75ce-4138-a9d4-ddd64660c529/cluster-node-tuning-operator/0.log" Mar 12 12:32:10.129996 master-0 kubenswrapper[13984]: I0312 12:32:10.129974 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-9vtjp" event={"ID":"b9194868-75ce-4138-a9d4-ddd64660c529","Type":"ContainerStarted","Data":"4f6d2f047940fec859eac0bf5b5273f89df41bda70171f45ff07c9729040d754"} Mar 12 12:32:10.136340 master-0 kubenswrapper[13984]: I0312 12:32:10.136299 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-qpd6h" event={"ID":"0aeeef2a-f9df-4f87-b985-bd1da94c76c3","Type":"ContainerStarted","Data":"299abeec72e97921b49abf8b12743ad011d12e72a09459a3feb758140e6ab88d"} Mar 12 12:32:10.144308 master-0 kubenswrapper[13984]: I0312 12:32:10.143760 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-lxvgd" event={"ID":"99a11fe6-48a1-439e-b788-158dbe267dcd","Type":"ContainerStarted","Data":"7a1082a562e6a1cfc3c08828697629e2b19b21fcab6d79bc6dc22725cc485ee3"} Mar 12 12:32:10.154491 master-0 kubenswrapper[13984]: I0312 12:32:10.153404 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-vmj4h" event={"ID":"5a012d0b-d1a8-4cd3-8b91-b346d0445f24","Type":"ContainerStarted","Data":"b5b3eb3d1ffdc24250f914200b99f7489898c3b4607750c03c1ae50f5697d8c6"} Mar 12 12:32:10.167493 master-0 kubenswrapper[13984]: I0312 12:32:10.166991 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-5q4fw" 
event={"ID":"f3b704e7-1291-4645-8a0d-2a937829d7ac","Type":"ContainerStarted","Data":"4b43be1569798069805f6a53dcf5ff8b0103307a876d6d7a4bb2197afe5292c0"} Mar 12 12:32:10.171492 master-0 kubenswrapper[13984]: I0312 12:32:10.170810 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kcnf4" event={"ID":"cfd178d7-f518-413b-95ab-ab6687be6e0f","Type":"ContainerStarted","Data":"e05ad26b9733254366e4181661505b6fd1198c883fba16499fcf1c227617450b"} Mar 12 12:32:10.177495 master-0 kubenswrapper[13984]: I0312 12:32:10.176167 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-hdhhs" event={"ID":"9ec202b2-e051-466b-a3fb-c054db0a9b16","Type":"ContainerStarted","Data":"88d87d834a72ee91e8c276b2fdc40af936cc608218e6347840ab1f83c2200388"} Mar 12 12:32:10.221914 master-0 kubenswrapper[13984]: I0312 12:32:10.221874 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" Mar 12 12:32:10.433938 master-0 kubenswrapper[13984]: I0312 12:32:10.433828 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:32:10.439234 master-0 kubenswrapper[13984]: I0312 12:32:10.439189 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:32:12.661837 master-0 kubenswrapper[13984]: I0312 12:32:12.661762 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-fg5mg" Mar 12 12:32:16.233081 master-0 kubenswrapper[13984]: I0312 12:32:16.232947 13984 generic.go:334] "Generic (PLEG): container finished" podID="114b1d16-b37d-449c-84e3-3fb3f8b20eaa" 
containerID="d4a43fbaaf0cd2d913ef5abe0c226480d6e639296585062ad2bcfe4ba7c6658f" exitCode=0 Mar 12 12:32:16.233081 master-0 kubenswrapper[13984]: I0312 12:32:16.233028 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" event={"ID":"114b1d16-b37d-449c-84e3-3fb3f8b20eaa","Type":"ContainerDied","Data":"d4a43fbaaf0cd2d913ef5abe0c226480d6e639296585062ad2bcfe4ba7c6658f"} Mar 12 12:32:16.234128 master-0 kubenswrapper[13984]: I0312 12:32:16.233809 13984 scope.go:117] "RemoveContainer" containerID="d4a43fbaaf0cd2d913ef5abe0c226480d6e639296585062ad2bcfe4ba7c6658f" Mar 12 12:32:17.243858 master-0 kubenswrapper[13984]: I0312 12:32:17.243745 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-7dg9w" event={"ID":"114b1d16-b37d-449c-84e3-3fb3f8b20eaa","Type":"ContainerStarted","Data":"7efd5c6fb6e4eeb485d8f58f205ce5031024e3b5871caabe75005dbb151ea8f1"} Mar 12 12:32:19.191606 master-0 kubenswrapper[13984]: I0312 12:32:19.191536 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 12 12:32:19.192123 master-0 kubenswrapper[13984]: E0312 12:32:19.191935 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ef1ec5-3e17-485a-9797-566ef207fa0a" containerName="installer" Mar 12 12:32:19.192123 master-0 kubenswrapper[13984]: I0312 12:32:19.191953 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ef1ec5-3e17-485a-9797-566ef207fa0a" containerName="installer" Mar 12 12:32:19.192123 master-0 kubenswrapper[13984]: I0312 12:32:19.192121 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ef1ec5-3e17-485a-9797-566ef207fa0a" containerName="installer" Mar 12 12:32:19.192748 master-0 kubenswrapper[13984]: I0312 12:32:19.192723 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.194344 master-0 kubenswrapper[13984]: I0312 12:32:19.194312 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-tzq7l" Mar 12 12:32:19.194489 master-0 kubenswrapper[13984]: I0312 12:32:19.194443 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 12 12:32:19.217957 master-0 kubenswrapper[13984]: I0312 12:32:19.217896 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 12 12:32:19.266623 master-0 kubenswrapper[13984]: I0312 12:32:19.266572 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/950b563e-a63d-4f3b-b179-f4a2f071739f-kube-api-access\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.266821 master-0 kubenswrapper[13984]: I0312 12:32:19.266761 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.267099 master-0 kubenswrapper[13984]: I0312 12:32:19.267072 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-var-lock\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.368056 master-0 kubenswrapper[13984]: I0312 12:32:19.367992 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-var-lock\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.368282 master-0 kubenswrapper[13984]: I0312 12:32:19.368078 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/950b563e-a63d-4f3b-b179-f4a2f071739f-kube-api-access\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.368282 master-0 kubenswrapper[13984]: I0312 12:32:19.368112 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.368282 master-0 kubenswrapper[13984]: I0312 12:32:19.368131 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-var-lock\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.368282 master-0 kubenswrapper[13984]: I0312 12:32:19.368177 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.388153 master-0 kubenswrapper[13984]: I0312 12:32:19.388091 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/950b563e-a63d-4f3b-b179-f4a2f071739f-kube-api-access\") pod \"installer-5-master-0\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.508301 master-0 kubenswrapper[13984]: I0312 12:32:19.508210 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 12 12:32:19.944640 master-0 kubenswrapper[13984]: I0312 12:32:19.944577 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 12 12:32:19.952261 master-0 kubenswrapper[13984]: W0312 12:32:19.952203 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod950b563e_a63d_4f3b_b179_f4a2f071739f.slice/crio-78a7d4ac3a18f185f9cbbfda2f2311015baaf728740462b3b347f4063e780ec7 WatchSource:0}: Error finding container 78a7d4ac3a18f185f9cbbfda2f2311015baaf728740462b3b347f4063e780ec7: Status 404 returned error can't find the container with id 78a7d4ac3a18f185f9cbbfda2f2311015baaf728740462b3b347f4063e780ec7 Mar 12 12:32:20.096060 master-0 kubenswrapper[13984]: I0312 12:32:20.096019 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:32:20.268784 master-0 kubenswrapper[13984]: I0312 12:32:20.268656 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"950b563e-a63d-4f3b-b179-f4a2f071739f","Type":"ContainerStarted","Data":"78a7d4ac3a18f185f9cbbfda2f2311015baaf728740462b3b347f4063e780ec7"} Mar 12 12:32:21.294516 master-0 kubenswrapper[13984]: I0312 12:32:21.293906 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" 
event={"ID":"950b563e-a63d-4f3b-b179-f4a2f071739f","Type":"ContainerStarted","Data":"f2eacdd07a16a2435dfc8f871501cf98e3c9bcb25686aed1ec3614fe561e3f1b"} Mar 12 12:32:21.336537 master-0 kubenswrapper[13984]: I0312 12:32:21.334500 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=2.334462722 podStartE2EDuration="2.334462722s" podCreationTimestamp="2026-03-12 12:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:21.328989972 +0000 UTC m=+473.527005474" watchObservedRunningTime="2026-03-12 12:32:21.334462722 +0000 UTC m=+473.532478214" Mar 12 12:32:21.980973 master-0 kubenswrapper[13984]: I0312 12:32:21.980883 13984 scope.go:117] "RemoveContainer" containerID="f0fb0da09e8c36b2b08b8d5e0695af6a8eeabc526c30a2a41b1610b3f1603e7d" Mar 12 12:32:21.981427 master-0 kubenswrapper[13984]: E0312 12:32:21.981367 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-autoscaler-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-autoscaler-operator pod=cluster-autoscaler-operator-69576476f7-ph7gk_openshift-machine-api(632651f7-6641-49d8-9c48-7f6ea5846538)\"" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" podUID="632651f7-6641-49d8-9c48-7f6ea5846538" Mar 12 12:32:22.980137 master-0 kubenswrapper[13984]: I0312 12:32:22.980081 13984 scope.go:117] "RemoveContainer" containerID="a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621" Mar 12 12:32:22.980873 master-0 kubenswrapper[13984]: E0312 12:32:22.980331 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-baremetal-operator 
pod=cluster-baremetal-operator-5cdb4c5598-pb97p_openshift-machine-api(c7d2a100-a24a-4ae6-bd8e-4530163a3ffe)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" podUID="c7d2a100-a24a-4ae6-bd8e-4530163a3ffe" Mar 12 12:32:23.981050 master-0 kubenswrapper[13984]: I0312 12:32:23.980979 13984 scope.go:117] "RemoveContainer" containerID="cba7d533b5bc6aa262c2d422e177581645a93f32a2a9e5df1bae21e883522eb0" Mar 12 12:32:24.311906 master-0 kubenswrapper[13984]: I0312 12:32:24.311790 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/3.log" Mar 12 12:32:24.311906 master-0 kubenswrapper[13984]: I0312 12:32:24.311848 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kf7kw" event={"ID":"7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef","Type":"ContainerStarted","Data":"18345948b33c52111b9624adb5fdac12a2ec981d2a30a2b1b774750587e6cb90"} Mar 12 12:32:27.751670 master-0 kubenswrapper[13984]: I0312 12:32:27.750541 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t"] Mar 12 12:32:27.752172 master-0 kubenswrapper[13984]: I0312 12:32:27.752132 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" Mar 12 12:32:27.754367 master-0 kubenswrapper[13984]: I0312 12:32:27.754317 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-dsw4r" Mar 12 12:32:27.763500 master-0 kubenswrapper[13984]: I0312 12:32:27.760542 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"] Mar 12 12:32:27.763500 master-0 kubenswrapper[13984]: I0312 12:32:27.760814 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager" containerID="cri-o://7cc6bae69d9bd535eb5f09007bd50d092688d8fbbf463b94e2879d9bbc8321ea" gracePeriod=30 Mar 12 12:32:27.775758 master-0 kubenswrapper[13984]: I0312 12:32:27.775552 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t"] Mar 12 12:32:27.794212 master-0 kubenswrapper[13984]: I0312 12:32:27.791099 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f87d47d96-c24tv"] Mar 12 12:32:27.794212 master-0 kubenswrapper[13984]: I0312 12:32:27.791403 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerName="controller-manager" containerID="cri-o://fccd1d387d9f834932469bff2fb734f1ceea36c669a79dc7f029e4319a59c287" gracePeriod=30 Mar 12 12:32:27.801908 master-0 kubenswrapper[13984]: I0312 12:32:27.800328 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9rdn5"] Mar 12 12:32:27.801908 master-0 kubenswrapper[13984]: I0312 12:32:27.801503 13984 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:27.802642 master-0 kubenswrapper[13984]: I0312 12:32:27.802599 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wvlq\" (UniqueName: \"kubernetes.io/projected/ae1447f8-82b1-4685-8806-a8ae6e802c15-kube-api-access-4wvlq\") pod \"multus-admission-controller-56bbfd46b8-qmv2t\" (UID: \"ae1447f8-82b1-4685-8806-a8ae6e802c15\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" Mar 12 12:32:27.803099 master-0 kubenswrapper[13984]: I0312 12:32:27.802835 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae1447f8-82b1-4685-8806-a8ae6e802c15-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-qmv2t\" (UID: \"ae1447f8-82b1-4685-8806-a8ae6e802c15\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" Mar 12 12:32:27.814541 master-0 kubenswrapper[13984]: I0312 12:32:27.808544 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 12 12:32:27.814541 master-0 kubenswrapper[13984]: I0312 12:32:27.808608 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-glbz2" Mar 12 12:32:27.882203 master-0 kubenswrapper[13984]: I0312 12:32:27.882130 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6d9f596674-hxrr7"] Mar 12 12:32:27.885553 master-0 kubenswrapper[13984]: I0312 12:32:27.885346 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.891672 master-0 kubenswrapper[13984]: I0312 12:32:27.891451 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 12 12:32:27.891859 master-0 kubenswrapper[13984]: I0312 12:32:27.891835 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 12 12:32:27.891978 master-0 kubenswrapper[13984]: I0312 12:32:27.891961 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 12 12:32:27.893742 master-0 kubenswrapper[13984]: I0312 12:32:27.892148 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 12 12:32:27.897565 master-0 kubenswrapper[13984]: I0312 12:32:27.895846 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-pbjx7" Mar 12 12:32:27.899522 master-0 kubenswrapper[13984]: I0312 12:32:27.897792 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 12 12:32:27.919232 master-0 kubenswrapper[13984]: I0312 12:32:27.919195 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d9f596674-hxrr7"] Mar 12 12:32:27.936380 master-0 kubenswrapper[13984]: I0312 12:32:27.936316 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-federate-client-tls\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.936380 master-0 kubenswrapper[13984]: I0312 12:32:27.936379 
13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-telemeter-client-tls\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.936725 master-0 kubenswrapper[13984]: I0312 12:32:27.936550 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsf7w\" (UniqueName: \"kubernetes.io/projected/eb40fb4f-b815-4850-b446-7bd3aae4c637-kube-api-access-fsf7w\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:27.936725 master-0 kubenswrapper[13984]: I0312 12:32:27.936595 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae1447f8-82b1-4685-8806-a8ae6e802c15-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-qmv2t\" (UID: \"ae1447f8-82b1-4685-8806-a8ae6e802c15\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" Mar 12 12:32:27.936725 master-0 kubenswrapper[13984]: I0312 12:32:27.936648 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/eb40fb4f-b815-4850-b446-7bd3aae4c637-ready\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:27.936725 master-0 kubenswrapper[13984]: I0312 12:32:27.936694 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.936921 master-0 kubenswrapper[13984]: I0312 12:32:27.936758 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wvlq\" (UniqueName: \"kubernetes.io/projected/ae1447f8-82b1-4685-8806-a8ae6e802c15-kube-api-access-4wvlq\") pod \"multus-admission-controller-56bbfd46b8-qmv2t\" (UID: \"ae1447f8-82b1-4685-8806-a8ae6e802c15\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" Mar 12 12:32:27.936921 master-0 kubenswrapper[13984]: I0312 12:32:27.936792 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-secret-telemeter-client\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.936921 master-0 kubenswrapper[13984]: I0312 12:32:27.936879 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb40fb4f-b815-4850-b446-7bd3aae4c637-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:27.937062 master-0 kubenswrapper[13984]: I0312 12:32:27.936945 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-serving-certs-ca-bundle\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: 
\"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.940387 master-0 kubenswrapper[13984]: I0312 12:32:27.939626 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb40fb4f-b815-4850-b446-7bd3aae4c637-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:27.940387 master-0 kubenswrapper[13984]: I0312 12:32:27.939678 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-metrics-client-ca\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.940387 master-0 kubenswrapper[13984]: I0312 12:32:27.939779 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.940387 master-0 kubenswrapper[13984]: I0312 12:32:27.939872 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjm4q\" (UniqueName: \"kubernetes.io/projected/790b9d8a-c3b9-41f6-8333-342e76485a8d-kube-api-access-qjm4q\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:27.957958 master-0 kubenswrapper[13984]: I0312 12:32:27.957905 13984 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 12 12:32:27.966854 master-0 kubenswrapper[13984]: I0312 12:32:27.965134 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae1447f8-82b1-4685-8806-a8ae6e802c15-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-qmv2t\" (UID: \"ae1447f8-82b1-4685-8806-a8ae6e802c15\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" Mar 12 12:32:27.994367 master-0 kubenswrapper[13984]: I0312 12:32:27.994330 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wvlq\" (UniqueName: \"kubernetes.io/projected/ae1447f8-82b1-4685-8806-a8ae6e802c15-kube-api-access-4wvlq\") pod \"multus-admission-controller-56bbfd46b8-qmv2t\" (UID: \"ae1447f8-82b1-4685-8806-a8ae6e802c15\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" Mar 12 12:32:28.053662 master-0 kubenswrapper[13984]: I0312 12:32:28.053575 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb40fb4f-b815-4850-b446-7bd3aae4c637-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:28.053811 master-0 kubenswrapper[13984]: I0312 12:32:28.053677 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-serving-certs-ca-bundle\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.053811 master-0 kubenswrapper[13984]: I0312 12:32:28.053715 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb40fb4f-b815-4850-b446-7bd3aae4c637-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:28.053811 master-0 kubenswrapper[13984]: I0312 12:32:28.053734 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-metrics-client-ca\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.053811 master-0 kubenswrapper[13984]: I0312 12:32:28.053775 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.053934 master-0 kubenswrapper[13984]: I0312 12:32:28.053830 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjm4q\" (UniqueName: \"kubernetes.io/projected/790b9d8a-c3b9-41f6-8333-342e76485a8d-kube-api-access-qjm4q\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.055149 master-0 kubenswrapper[13984]: I0312 12:32:28.055126 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-federate-client-tls\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " 
pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.055303 master-0 kubenswrapper[13984]: I0312 12:32:28.055191 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-telemeter-client-tls\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.055303 master-0 kubenswrapper[13984]: I0312 12:32:28.055218 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsf7w\" (UniqueName: \"kubernetes.io/projected/eb40fb4f-b815-4850-b446-7bd3aae4c637-kube-api-access-fsf7w\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:28.055303 master-0 kubenswrapper[13984]: I0312 12:32:28.055248 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/eb40fb4f-b815-4850-b446-7bd3aae4c637-ready\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:28.055303 master-0 kubenswrapper[13984]: I0312 12:32:28.055264 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 12 12:32:28.055303 master-0 kubenswrapper[13984]: I0312 12:32:28.055271 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " 
pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.055303 master-0 kubenswrapper[13984]: I0312 12:32:28.055301 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-secret-telemeter-client\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" Mar 12 12:32:28.056541 master-0 kubenswrapper[13984]: I0312 12:32:28.055471 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb40fb4f-b815-4850-b446-7bd3aae4c637-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:28.056541 master-0 kubenswrapper[13984]: I0312 12:32:28.055870 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/eb40fb4f-b815-4850-b446-7bd3aae4c637-ready\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:32:28.056912 master-0 kubenswrapper[13984]: I0312 12:32:28.056893 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 12 12:32:28.057297 master-0 kubenswrapper[13984]: I0312 12:32:28.057132 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 12 12:32:28.058223 master-0 kubenswrapper[13984]: I0312 12:32:28.057359 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 12 12:32:28.058223 master-0 kubenswrapper[13984]: I0312 12:32:28.057989 13984 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 12 12:32:28.061690 master-0 kubenswrapper[13984]: I0312 12:32:28.058340 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 12 12:32:28.061690 master-0 kubenswrapper[13984]: I0312 12:32:28.060880 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.062223 master-0 kubenswrapper[13984]: I0312 12:32:28.062191 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-metrics-client-ca\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.066731 master-0 kubenswrapper[13984]: I0312 12:32:28.066414 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/790b9d8a-c3b9-41f6-8333-342e76485a8d-serving-certs-ca-bundle\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.068093 master-0 kubenswrapper[13984]: I0312 12:32:28.068058 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb40fb4f-b815-4850-b446-7bd3aae4c637-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5"
Mar 12 12:32:28.069784 master-0 kubenswrapper[13984]: I0312 12:32:28.069640 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-telemeter-client-tls\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.071850 master-0 kubenswrapper[13984]: I0312 12:32:28.071370 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-secret-telemeter-client\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.071850 master-0 kubenswrapper[13984]: I0312 12:32:28.071797 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-federate-client-tls\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.072236 master-0 kubenswrapper[13984]: I0312 12:32:28.072202 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/790b9d8a-c3b9-41f6-8333-342e76485a8d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.073265 master-0 kubenswrapper[13984]: I0312 12:32:28.073231 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjm4q\" (UniqueName: \"kubernetes.io/projected/790b9d8a-c3b9-41f6-8333-342e76485a8d-kube-api-access-qjm4q\") pod \"telemeter-client-6d9f596674-hxrr7\" (UID: \"790b9d8a-c3b9-41f6-8333-342e76485a8d\") " pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.080842 master-0 kubenswrapper[13984]: I0312 12:32:28.080784 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-dsw4r"
Mar 12 12:32:28.091978 master-0 kubenswrapper[13984]: I0312 12:32:28.089908 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t"
Mar 12 12:32:28.091978 master-0 kubenswrapper[13984]: I0312 12:32:28.090778 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsf7w\" (UniqueName: \"kubernetes.io/projected/eb40fb4f-b815-4850-b446-7bd3aae4c637-kube-api-access-fsf7w\") pod \"cni-sysctl-allowlist-ds-9rdn5\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5"
Mar 12 12:32:28.348626 master-0 kubenswrapper[13984]: I0312 12:32:28.348511 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-glbz2"
Mar 12 12:32:28.352899 master-0 kubenswrapper[13984]: I0312 12:32:28.352849 13984 generic.go:334] "Generic (PLEG): container finished" podID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerID="fccd1d387d9f834932469bff2fb734f1ceea36c669a79dc7f029e4319a59c287" exitCode=0
Mar 12 12:32:28.353051 master-0 kubenswrapper[13984]: I0312 12:32:28.352910 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" event={"ID":"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0","Type":"ContainerDied","Data":"fccd1d387d9f834932469bff2fb734f1ceea36c669a79dc7f029e4319a59c287"}
Mar 12 12:32:28.353051 master-0 kubenswrapper[13984]: I0312 12:32:28.352941 13984 scope.go:117] "RemoveContainer" containerID="cddd485ac51118295f95288e1b47d7560d221f73530c7f79d262c83d31f5faa3"
Mar 12 12:32:28.356071 master-0 kubenswrapper[13984]: I0312 12:32:28.355623 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5"
Mar 12 12:32:28.357587 master-0 kubenswrapper[13984]: I0312 12:32:28.357526 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-pbjx7"
Mar 12 12:32:28.363944 master-0 kubenswrapper[13984]: I0312 12:32:28.359237 13984 generic.go:334] "Generic (PLEG): container finished" podID="021b22e3-b4c5-426d-b761-181f1e54175d" containerID="7cc6bae69d9bd535eb5f09007bd50d092688d8fbbf463b94e2879d9bbc8321ea" exitCode=0
Mar 12 12:32:28.363944 master-0 kubenswrapper[13984]: I0312 12:32:28.361913 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" event={"ID":"021b22e3-b4c5-426d-b761-181f1e54175d","Type":"ContainerDied","Data":"7cc6bae69d9bd535eb5f09007bd50d092688d8fbbf463b94e2879d9bbc8321ea"}
Mar 12 12:32:28.365404 master-0 kubenswrapper[13984]: I0312 12:32:28.365364 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7"
Mar 12 12:32:28.377254 master-0 kubenswrapper[13984]: I0312 12:32:28.377210 13984 scope.go:117] "RemoveContainer" containerID="69a3615121435266c91b06fdc9e703ea81cecb073e9b8530439411a8ada925fe"
Mar 12 12:32:28.405507 master-0 kubenswrapper[13984]: I0312 12:32:28.402098 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:32:28.553800 master-0 kubenswrapper[13984]: I0312 12:32:28.552330 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t"]
Mar 12 12:32:28.579441 master-0 kubenswrapper[13984]: I0312 12:32:28.578361 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca\") pod \"021b22e3-b4c5-426d-b761-181f1e54175d\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") "
Mar 12 12:32:28.579441 master-0 kubenswrapper[13984]: I0312 12:32:28.578439 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config\") pod \"021b22e3-b4c5-426d-b761-181f1e54175d\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") "
Mar 12 12:32:28.579441 master-0 kubenswrapper[13984]: I0312 12:32:28.578496 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xggp6\" (UniqueName: \"kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6\") pod \"021b22e3-b4c5-426d-b761-181f1e54175d\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") "
Mar 12 12:32:28.579441 master-0 kubenswrapper[13984]: I0312 12:32:28.578559 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert\") pod \"021b22e3-b4c5-426d-b761-181f1e54175d\" (UID: \"021b22e3-b4c5-426d-b761-181f1e54175d\") "
Mar 12 12:32:28.581399 master-0 kubenswrapper[13984]: I0312 12:32:28.580870 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca" (OuterVolumeSpecName: "client-ca") pod "021b22e3-b4c5-426d-b761-181f1e54175d" (UID: "021b22e3-b4c5-426d-b761-181f1e54175d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:32:28.582264 master-0 kubenswrapper[13984]: I0312 12:32:28.582225 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config" (OuterVolumeSpecName: "config") pod "021b22e3-b4c5-426d-b761-181f1e54175d" (UID: "021b22e3-b4c5-426d-b761-181f1e54175d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:32:28.585849 master-0 kubenswrapper[13984]: I0312 12:32:28.585802 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "021b22e3-b4c5-426d-b761-181f1e54175d" (UID: "021b22e3-b4c5-426d-b761-181f1e54175d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:32:28.585937 master-0 kubenswrapper[13984]: I0312 12:32:28.585814 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6" (OuterVolumeSpecName: "kube-api-access-xggp6") pod "021b22e3-b4c5-426d-b761-181f1e54175d" (UID: "021b22e3-b4c5-426d-b761-181f1e54175d"). InnerVolumeSpecName "kube-api-access-xggp6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:32:28.700517 master-0 kubenswrapper[13984]: I0312 12:32:28.696691 13984 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:28.700517 master-0 kubenswrapper[13984]: I0312 12:32:28.696736 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/021b22e3-b4c5-426d-b761-181f1e54175d-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:28.700517 master-0 kubenswrapper[13984]: I0312 12:32:28.696751 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xggp6\" (UniqueName: \"kubernetes.io/projected/021b22e3-b4c5-426d-b761-181f1e54175d-kube-api-access-xggp6\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:28.700517 master-0 kubenswrapper[13984]: I0312 12:32:28.696763 13984 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/021b22e3-b4c5-426d-b761-181f1e54175d-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:28.767629 master-0 kubenswrapper[13984]: I0312 12:32:28.767578 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6d9f596674-hxrr7"]
Mar 12 12:32:28.777228 master-0 kubenswrapper[13984]: I0312 12:32:28.777151 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:32:28.779176 master-0 kubenswrapper[13984]: W0312 12:32:28.779115 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790b9d8a_c3b9_41f6_8333_342e76485a8d.slice/crio-e71faf407a7d06a89fc13d4fd5ba1f42b3960ea9072e0b8f2583e9866829acc6 WatchSource:0}: Error finding container e71faf407a7d06a89fc13d4fd5ba1f42b3960ea9072e0b8f2583e9866829acc6: Status 404 returned error can't find the container with id e71faf407a7d06a89fc13d4fd5ba1f42b3960ea9072e0b8f2583e9866829acc6
Mar 12 12:32:28.784372 master-0 kubenswrapper[13984]: I0312 12:32:28.784309 13984 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 12 12:32:28.899566 master-0 kubenswrapper[13984]: I0312 12:32:28.899512 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles\") pod \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") "
Mar 12 12:32:28.899716 master-0 kubenswrapper[13984]: I0312 12:32:28.899592 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") pod \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") "
Mar 12 12:32:28.899716 master-0 kubenswrapper[13984]: I0312 12:32:28.899637 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert\") pod \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") "
Mar 12 12:32:28.899716 master-0 kubenswrapper[13984]: I0312 12:32:28.899661 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config\") pod \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") "
Mar 12 12:32:28.899716 master-0 kubenswrapper[13984]: I0312 12:32:28.899687 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btbt7\" (UniqueName: \"kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7\") pod \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\" (UID: \"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0\") "
Mar 12 12:32:28.900956 master-0 kubenswrapper[13984]: I0312 12:32:28.900921 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca" (OuterVolumeSpecName: "client-ca") pod "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" (UID: "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:32:28.900956 master-0 kubenswrapper[13984]: I0312 12:32:28.900960 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" (UID: "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:32:28.901167 master-0 kubenswrapper[13984]: I0312 12:32:28.901126 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config" (OuterVolumeSpecName: "config") pod "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" (UID: "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:32:28.906011 master-0 kubenswrapper[13984]: I0312 12:32:28.905961 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" (UID: "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:32:28.906923 master-0 kubenswrapper[13984]: I0312 12:32:28.906889 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7" (OuterVolumeSpecName: "kube-api-access-btbt7") pod "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" (UID: "ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0"). InnerVolumeSpecName "kube-api-access-btbt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:32:28.916069 master-0 kubenswrapper[13984]: I0312 12:32:28.916006 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"]
Mar 12 12:32:28.916309 master-0 kubenswrapper[13984]: E0312 12:32:28.916286 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerName="controller-manager"
Mar 12 12:32:28.916309 master-0 kubenswrapper[13984]: I0312 12:32:28.916303 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerName="controller-manager"
Mar 12 12:32:28.916377 master-0 kubenswrapper[13984]: E0312 12:32:28.916323 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerName="controller-manager"
Mar 12 12:32:28.916377 master-0 kubenswrapper[13984]: I0312 12:32:28.916331 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerName="controller-manager"
Mar 12 12:32:28.916377 master-0 kubenswrapper[13984]: E0312 12:32:28.916359 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager"
Mar 12 12:32:28.916377 master-0 kubenswrapper[13984]: I0312 12:32:28.916366 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager"
Mar 12 12:32:28.916377 master-0 kubenswrapper[13984]: E0312 12:32:28.916377 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager"
Mar 12 12:32:28.916585 master-0 kubenswrapper[13984]: I0312 12:32:28.916384 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager"
Mar 12 12:32:28.916585 master-0 kubenswrapper[13984]: I0312 12:32:28.916516 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerName="controller-manager"
Mar 12 12:32:28.916585 master-0 kubenswrapper[13984]: I0312 12:32:28.916548 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" containerName="controller-manager"
Mar 12 12:32:28.916585 master-0 kubenswrapper[13984]: I0312 12:32:28.916569 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager"
Mar 12 12:32:28.916585 master-0 kubenswrapper[13984]: I0312 12:32:28.916586 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" containerName="route-controller-manager"
Mar 12 12:32:28.917034 master-0 kubenswrapper[13984]: I0312 12:32:28.917001 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:28.920150 master-0 kubenswrapper[13984]: I0312 12:32:28.920101 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-vhqj5"
Mar 12 12:32:28.930215 master-0 kubenswrapper[13984]: I0312 12:32:28.930157 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"]
Mar 12 12:32:29.001420 master-0 kubenswrapper[13984]: I0312 12:32:29.001347 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btbt7\" (UniqueName: \"kubernetes.io/projected/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-kube-api-access-btbt7\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:29.001420 master-0 kubenswrapper[13984]: I0312 12:32:29.001399 13984 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:29.001420 master-0 kubenswrapper[13984]: I0312 12:32:29.001413 13984 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:29.001420 master-0 kubenswrapper[13984]: I0312 12:32:29.001426 13984 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:29.001420 master-0 kubenswrapper[13984]: I0312 12:32:29.001438 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:29.102528 master-0 kubenswrapper[13984]: I0312 12:32:29.102447 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ebc5ac2-adf0-4e85-9685-962331cc2c11-serving-cert\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.102634 master-0 kubenswrapper[13984]: I0312 12:32:29.102592 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ebc5ac2-adf0-4e85-9685-962331cc2c11-client-ca\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.102756 master-0 kubenswrapper[13984]: I0312 12:32:29.102730 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ebc5ac2-adf0-4e85-9685-962331cc2c11-config\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.102797 master-0 kubenswrapper[13984]: I0312 12:32:29.102772 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8sx5\" (UniqueName: \"kubernetes.io/projected/9ebc5ac2-adf0-4e85-9685-962331cc2c11-kube-api-access-h8sx5\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.204602 master-0 kubenswrapper[13984]: I0312 12:32:29.204464 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ebc5ac2-adf0-4e85-9685-962331cc2c11-serving-cert\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.204879 master-0 kubenswrapper[13984]: I0312 12:32:29.204653 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ebc5ac2-adf0-4e85-9685-962331cc2c11-client-ca\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.205541 master-0 kubenswrapper[13984]: I0312 12:32:29.205258 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ebc5ac2-adf0-4e85-9685-962331cc2c11-config\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.205541 master-0 kubenswrapper[13984]: I0312 12:32:29.205335 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8sx5\" (UniqueName: \"kubernetes.io/projected/9ebc5ac2-adf0-4e85-9685-962331cc2c11-kube-api-access-h8sx5\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.206424 master-0 kubenswrapper[13984]: I0312 12:32:29.205765 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9ebc5ac2-adf0-4e85-9685-962331cc2c11-client-ca\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.206424 master-0 kubenswrapper[13984]: I0312 12:32:29.206362 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ebc5ac2-adf0-4e85-9685-962331cc2c11-config\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.208276 master-0 kubenswrapper[13984]: I0312 12:32:29.208230 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9ebc5ac2-adf0-4e85-9685-962331cc2c11-serving-cert\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.230606 master-0 kubenswrapper[13984]: I0312 12:32:29.230526 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8sx5\" (UniqueName: \"kubernetes.io/projected/9ebc5ac2-adf0-4e85-9685-962331cc2c11-kube-api-access-h8sx5\") pod \"route-controller-manager-7bd47cff7c-rt6qs\" (UID: \"9ebc5ac2-adf0-4e85-9685-962331cc2c11\") " pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.255366 master-0 kubenswrapper[13984]: I0312 12:32:29.255302 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"
Mar 12 12:32:29.376622 master-0 kubenswrapper[13984]: I0312 12:32:29.376566 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf" event={"ID":"021b22e3-b4c5-426d-b761-181f1e54175d","Type":"ContainerDied","Data":"de8fe1d3e9190ca438311b337298d3edddf0c73070b4d341a6bf06c706da4b32"}
Mar 12 12:32:29.376724 master-0 kubenswrapper[13984]: I0312 12:32:29.376582 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"
Mar 12 12:32:29.376760 master-0 kubenswrapper[13984]: I0312 12:32:29.376637 13984 scope.go:117] "RemoveContainer" containerID="7cc6bae69d9bd535eb5f09007bd50d092688d8fbbf463b94e2879d9bbc8321ea"
Mar 12 12:32:29.378016 master-0 kubenswrapper[13984]: I0312 12:32:29.377978 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" event={"ID":"eb40fb4f-b815-4850-b446-7bd3aae4c637","Type":"ContainerStarted","Data":"2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913"}
Mar 12 12:32:29.378078 master-0 kubenswrapper[13984]: I0312 12:32:29.378021 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" event={"ID":"eb40fb4f-b815-4850-b446-7bd3aae4c637","Type":"ContainerStarted","Data":"0caf853d896ff411401c6bbbffb9603c1b1eebda4c622f846ea99046904d7425"}
Mar 12 12:32:29.378691 master-0 kubenswrapper[13984]: I0312 12:32:29.378666 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5"
Mar 12 12:32:29.387034 master-0 kubenswrapper[13984]: I0312 12:32:29.385416 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" event={"ID":"790b9d8a-c3b9-41f6-8333-342e76485a8d","Type":"ContainerStarted","Data":"e71faf407a7d06a89fc13d4fd5ba1f42b3960ea9072e0b8f2583e9866829acc6"}
Mar 12 12:32:29.392225 master-0 kubenswrapper[13984]: I0312 12:32:29.391309 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" event={"ID":"ae1447f8-82b1-4685-8806-a8ae6e802c15","Type":"ContainerStarted","Data":"2fd4809a4593ab5dcabf1f7942b965d4602f558c3b74e4462b0252e46da3971f"}
Mar 12 12:32:29.392225 master-0 kubenswrapper[13984]: I0312 12:32:29.391364 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" event={"ID":"ae1447f8-82b1-4685-8806-a8ae6e802c15","Type":"ContainerStarted","Data":"53a563ebd989befea58dfaaf5f4b21b1f37b6b7d9f32c42d83c9172bffb1f6d0"}
Mar 12 12:32:29.392225 master-0 kubenswrapper[13984]: I0312 12:32:29.391377 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" event={"ID":"ae1447f8-82b1-4685-8806-a8ae6e802c15","Type":"ContainerStarted","Data":"17b89e1b4ef4cad114590a8c20b14d26597b4da0aa0a31a62358980a9837ef71"}
Mar 12 12:32:29.397577 master-0 kubenswrapper[13984]: I0312 12:32:29.394623 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv" event={"ID":"ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0","Type":"ContainerDied","Data":"76740ed254dde85a7659405fc8bc23705b75543841b9b4c3cc5aae0ef87f43b9"}
Mar 12 12:32:29.397577 master-0 kubenswrapper[13984]: I0312 12:32:29.394656 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f87d47d96-c24tv"
Mar 12 12:32:29.401618 master-0 kubenswrapper[13984]: I0312 12:32:29.400905 13984 scope.go:117] "RemoveContainer" containerID="fccd1d387d9f834932469bff2fb734f1ceea36c669a79dc7f029e4319a59c287"
Mar 12 12:32:29.421348 master-0 kubenswrapper[13984]: I0312 12:32:29.421192 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5"
Mar 12 12:32:29.422255 master-0 kubenswrapper[13984]: I0312 12:32:29.422225 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"]
Mar 12 12:32:29.425495 master-0 kubenswrapper[13984]: I0312 12:32:29.425416 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7665b44c8d-2lgnf"]
Mar 12 12:32:29.441386 master-0 kubenswrapper[13984]: I0312 12:32:29.441221 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-56bbfd46b8-qmv2t" podStartSLOduration=2.441201683 podStartE2EDuration="2.441201683s" podCreationTimestamp="2026-03-12 12:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:29.433833768 +0000 UTC m=+481.631849280" watchObservedRunningTime="2026-03-12 12:32:29.441201683 +0000 UTC m=+481.639217175"
Mar 12 12:32:29.472998 master-0 kubenswrapper[13984]: I0312 12:32:29.472891 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" podStartSLOduration=2.472854955 podStartE2EDuration="2.472854955s" podCreationTimestamp="2026-03-12 12:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:29.464164382 +0000 UTC m=+481.662179894" watchObservedRunningTime="2026-03-12 12:32:29.472854955 +0000 UTC m=+481.670870447"
Mar 12 12:32:29.485800 master-0 kubenswrapper[13984]: I0312 12:32:29.485738 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-xpzn2"]
Mar 12 12:32:29.486070 master-0 kubenswrapper[13984]: I0312 12:32:29.485997 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="multus-admission-controller" containerID="cri-o://e8a0643d2fccb7a3a8b1a9999fe3ff756f073c2a63121cc490f340f51eaf7001" gracePeriod=30
Mar 12 12:32:29.486175 master-0 kubenswrapper[13984]: I0312 12:32:29.486111 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="kube-rbac-proxy" containerID="cri-o://8cf7a7986d7461740b2a22c40e2265aaa2c172b0bc619290d4b6ae5354a29eb5" gracePeriod=30
Mar 12 12:32:29.520663 master-0 kubenswrapper[13984]: I0312 12:32:29.519679 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f87d47d96-c24tv"]
Mar 12 12:32:29.533610 master-0 kubenswrapper[13984]: I0312 12:32:29.533458 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f87d47d96-c24tv"]
Mar 12 12:32:29.660590 master-0 kubenswrapper[13984]: I0312 12:32:29.660528 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs"]
Mar 12 12:32:29.672034 master-0 kubenswrapper[13984]: W0312 12:32:29.671934 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ebc5ac2_adf0_4e85_9685_962331cc2c11.slice/crio-a484ae91b72113b275a4df800a899300cc9b9a36a0f86bb76adceb54f6d79e57 WatchSource:0}: Error finding container a484ae91b72113b275a4df800a899300cc9b9a36a0f86bb76adceb54f6d79e57: Status 404 returned error can't find the container with id a484ae91b72113b275a4df800a899300cc9b9a36a0f86bb76adceb54f6d79e57
Mar 12 12:32:29.917885 master-0 kubenswrapper[13984]: I0312 12:32:29.917700 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-74559cc99b-4g2ft"]
Mar 12 12:32:29.919351 master-0 kubenswrapper[13984]: I0312 12:32:29.919156 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft"
Mar 12 12:32:29.923205 master-0 kubenswrapper[13984]: I0312 12:32:29.923137 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-fc6g9"
Mar 12 12:32:29.923281 master-0 kubenswrapper[13984]: I0312 12:32:29.923195 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 12:32:29.924637 master-0 kubenswrapper[13984]: I0312 12:32:29.924251 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-proxy-ca-bundles\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft"
Mar 12 12:32:29.924637 master-0 kubenswrapper[13984]: I0312 12:32:29.924322 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749766e0-1167-4430-918e-c419de70126d-serving-cert\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft"
Mar 12 12:32:29.924637 master-0 kubenswrapper[13984]: I0312 12:32:29.924360 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-config\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft"
Mar 12 12:32:29.924637 master-0 kubenswrapper[13984]: I0312 12:32:29.924392 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-client-ca\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft"
Mar 12 12:32:29.924637 master-0 kubenswrapper[13984]: I0312 12:32:29.924458 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6vbn\" (UniqueName: \"kubernetes.io/projected/749766e0-1167-4430-918e-c419de70126d-kube-api-access-l6vbn\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft"
Mar 12 12:32:29.925709 master-0 kubenswrapper[13984]: I0312 12:32:29.925015 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 12 12:32:29.925709 master-0 kubenswrapper[13984]: I0312 12:32:29.925321 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 12:32:29.925709 master-0 kubenswrapper[13984]: I0312 12:32:29.925407 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:32:29.925887 master-0 kubenswrapper[13984]: I0312 12:32:29.925754 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 12 12:32:29.929602 master-0 kubenswrapper[13984]: I0312 12:32:29.929563 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 12:32:29.935423 master-0 kubenswrapper[13984]: I0312 12:32:29.935381 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74559cc99b-4g2ft"]
Mar 12 12:32:29.999441 master-0 kubenswrapper[13984]: I0312 12:32:29.999396 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="021b22e3-b4c5-426d-b761-181f1e54175d" path="/var/lib/kubelet/pods/021b22e3-b4c5-426d-b761-181f1e54175d/volumes"
Mar 12 12:32:30.000011 master-0 kubenswrapper[13984]: I0312 12:32:29.999988 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0" path="/var/lib/kubelet/pods/ce1515a8-5e96-4b3b-b2e0-b764e5a25dd0/volumes"
Mar 12 12:32:30.026197 master-0 kubenswrapper[13984]: I0312 12:32:30.026139 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-proxy-ca-bundles\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft"
Mar 12 12:32:30.026378 master-0 kubenswrapper[13984]: I0312 12:32:30.026213 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749766e0-1167-4430-918e-c419de70126d-serving-cert\") pod \"controller-manager-74559cc99b-4g2ft\" (UID:
\"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.026434 master-0 kubenswrapper[13984]: I0312 12:32:30.026400 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-config\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.026498 master-0 kubenswrapper[13984]: I0312 12:32:30.026444 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-client-ca\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.026536 master-0 kubenswrapper[13984]: I0312 12:32:30.026501 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6vbn\" (UniqueName: \"kubernetes.io/projected/749766e0-1167-4430-918e-c419de70126d-kube-api-access-l6vbn\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.027279 master-0 kubenswrapper[13984]: I0312 12:32:30.027261 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-proxy-ca-bundles\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.028054 master-0 kubenswrapper[13984]: I0312 12:32:30.028023 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-client-ca\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.028955 master-0 kubenswrapper[13984]: I0312 12:32:30.028910 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/749766e0-1167-4430-918e-c419de70126d-config\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.045263 master-0 kubenswrapper[13984]: I0312 12:32:30.045193 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/749766e0-1167-4430-918e-c419de70126d-serving-cert\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.046892 master-0 kubenswrapper[13984]: I0312 12:32:30.046846 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6vbn\" (UniqueName: \"kubernetes.io/projected/749766e0-1167-4430-918e-c419de70126d-kube-api-access-l6vbn\") pod \"controller-manager-74559cc99b-4g2ft\" (UID: \"749766e0-1167-4430-918e-c419de70126d\") " pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.260503 master-0 kubenswrapper[13984]: I0312 12:32:30.259794 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:30.411734 master-0 kubenswrapper[13984]: I0312 12:32:30.411689 13984 generic.go:334] "Generic (PLEG): container finished" podID="74d06933-afab-43a3-a1d3-88a569178d34" containerID="8cf7a7986d7461740b2a22c40e2265aaa2c172b0bc619290d4b6ae5354a29eb5" exitCode=0 Mar 12 12:32:30.411902 master-0 kubenswrapper[13984]: I0312 12:32:30.411751 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" event={"ID":"74d06933-afab-43a3-a1d3-88a569178d34","Type":"ContainerDied","Data":"8cf7a7986d7461740b2a22c40e2265aaa2c172b0bc619290d4b6ae5354a29eb5"} Mar 12 12:32:30.413783 master-0 kubenswrapper[13984]: I0312 12:32:30.413760 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs" event={"ID":"9ebc5ac2-adf0-4e85-9685-962331cc2c11","Type":"ContainerStarted","Data":"ea3afad9ea0c2bbbd7b46c9af120e442b649be8c068852b2a29d1bd4fd9bb641"} Mar 12 12:32:30.413844 master-0 kubenswrapper[13984]: I0312 12:32:30.413785 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs" event={"ID":"9ebc5ac2-adf0-4e85-9685-962331cc2c11","Type":"ContainerStarted","Data":"a484ae91b72113b275a4df800a899300cc9b9a36a0f86bb76adceb54f6d79e57"} Mar 12 12:32:30.414909 master-0 kubenswrapper[13984]: I0312 12:32:30.414870 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs" Mar 12 12:32:30.419760 master-0 kubenswrapper[13984]: I0312 12:32:30.419719 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs" Mar 12 12:32:30.478006 master-0 kubenswrapper[13984]: I0312 12:32:30.477932 13984 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bd47cff7c-rt6qs" podStartSLOduration=3.477907126 podStartE2EDuration="3.477907126s" podCreationTimestamp="2026-03-12 12:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:30.476436383 +0000 UTC m=+482.674451875" watchObservedRunningTime="2026-03-12 12:32:30.477907126 +0000 UTC m=+482.675922618" Mar 12 12:32:30.753278 master-0 kubenswrapper[13984]: I0312 12:32:30.753232 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9rdn5"] Mar 12 12:32:30.822885 master-0 kubenswrapper[13984]: I0312 12:32:30.822708 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-74559cc99b-4g2ft"] Mar 12 12:32:31.440272 master-0 kubenswrapper[13984]: I0312 12:32:31.440226 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" event={"ID":"749766e0-1167-4430-918e-c419de70126d","Type":"ContainerStarted","Data":"a0f844830e3df42ebdcf8279c0496c3884b9d5ec85933313512cb8430c53c126"} Mar 12 12:32:31.440882 master-0 kubenswrapper[13984]: I0312 12:32:31.440839 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:31.446470 master-0 kubenswrapper[13984]: I0312 12:32:31.446348 13984 patch_prober.go:28] interesting pod/controller-manager-74559cc99b-4g2ft container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.85:8443/healthz\": dial tcp 10.128.0.85:8443: connect: connection refused" start-of-body= Mar 12 12:32:31.446470 master-0 kubenswrapper[13984]: I0312 12:32:31.446409 13984 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" podUID="749766e0-1167-4430-918e-c419de70126d" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.85:8443/healthz\": dial tcp 10.128.0.85:8443: connect: connection refused" Mar 12 12:32:31.471678 master-0 kubenswrapper[13984]: I0312 12:32:31.471505 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" podStartSLOduration=4.4714637 podStartE2EDuration="4.4714637s" podCreationTimestamp="2026-03-12 12:32:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:31.468270597 +0000 UTC m=+483.666286099" watchObservedRunningTime="2026-03-12 12:32:31.4714637 +0000 UTC m=+483.669479192" Mar 12 12:32:32.449176 master-0 kubenswrapper[13984]: I0312 12:32:32.449110 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" event={"ID":"749766e0-1167-4430-918e-c419de70126d","Type":"ContainerStarted","Data":"a54811b963af1930c63733475c99103dcf4d14ebec176eeb52fa8a2dc7c5ac90"} Mar 12 12:32:32.454313 master-0 kubenswrapper[13984]: I0312 12:32:32.452246 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" event={"ID":"790b9d8a-c3b9-41f6-8333-342e76485a8d","Type":"ContainerStarted","Data":"786b44025e51c647ccad3704dcf4c3cc63d830daa215b6ee70798ebf09137d08"} Mar 12 12:32:32.454313 master-0 kubenswrapper[13984]: I0312 12:32:32.452379 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" 
gracePeriod=30 Mar 12 12:32:32.455419 master-0 kubenswrapper[13984]: I0312 12:32:32.455185 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-74559cc99b-4g2ft" Mar 12 12:32:32.571488 master-0 kubenswrapper[13984]: E0312 12:32:32.571388 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[alertmanager-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" Mar 12 12:32:33.464047 master-0 kubenswrapper[13984]: I0312 12:32:33.463937 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" event={"ID":"790b9d8a-c3b9-41f6-8333-342e76485a8d","Type":"ContainerStarted","Data":"f30c070e735e542752f0c82f37e7a20e33f914e07a7989f3fa03290289c7692a"} Mar 12 12:32:33.464047 master-0 kubenswrapper[13984]: I0312 12:32:33.464016 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" event={"ID":"790b9d8a-c3b9-41f6-8333-342e76485a8d","Type":"ContainerStarted","Data":"77d7dc8cf1033509a870b963ba63b6e9bcda3892ae7e20a193116b361c9c2358"} Mar 12 12:32:33.464047 master-0 kubenswrapper[13984]: I0312 12:32:33.464018 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:32:35.137084 master-0 kubenswrapper[13984]: I0312 12:32:35.137005 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:32:35.138318 master-0 kubenswrapper[13984]: I0312 12:32:35.138267 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:32:35.267355 master-0 kubenswrapper[13984]: I0312 12:32:35.267306 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-24x8c" Mar 12 12:32:35.275999 master-0 kubenswrapper[13984]: I0312 12:32:35.275911 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:32:35.703174 master-0 kubenswrapper[13984]: I0312 12:32:35.702566 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6d9f596674-hxrr7" podStartSLOduration=4.721739453 podStartE2EDuration="8.702546735s" podCreationTimestamp="2026-03-12 12:32:27 +0000 UTC" firstStartedPulling="2026-03-12 12:32:28.784218797 +0000 UTC m=+480.982234279" lastFinishedPulling="2026-03-12 12:32:32.765026069 +0000 UTC m=+484.963041561" observedRunningTime="2026-03-12 12:32:33.493607181 +0000 UTC m=+485.691622673" watchObservedRunningTime="2026-03-12 12:32:35.702546735 +0000 UTC m=+487.900562227" Mar 12 12:32:35.710007 master-0 kubenswrapper[13984]: I0312 12:32:35.709833 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:32:35.716744 master-0 kubenswrapper[13984]: W0312 12:32:35.716703 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41e53fa8_31cb_44a9_9411_8ce2df26b156.slice/crio-b17591b3bc628f3876e877072c2ef1407fe17291b50961427b132537d930b0e6 WatchSource:0}: Error finding container b17591b3bc628f3876e877072c2ef1407fe17291b50961427b132537d930b0e6: Status 404 returned error can't find the container with id b17591b3bc628f3876e877072c2ef1407fe17291b50961427b132537d930b0e6 Mar 12 12:32:35.980915 master-0 kubenswrapper[13984]: I0312 12:32:35.980579 13984 scope.go:117] "RemoveContainer" containerID="f0fb0da09e8c36b2b08b8d5e0695af6a8eeabc526c30a2a41b1610b3f1603e7d" Mar 12 12:32:35.980915 master-0 kubenswrapper[13984]: I0312 12:32:35.980607 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:32:35.980915 master-0 kubenswrapper[13984]: I0312 12:32:35.980636 13984 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:32:36.013118 master-0 kubenswrapper[13984]: I0312 12:32:36.009214 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 12 12:32:36.018498 master-0 kubenswrapper[13984]: I0312 12:32:36.017756 13984 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 12 12:32:36.025860 master-0 kubenswrapper[13984]: I0312 12:32:36.025776 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 12 12:32:36.060625 master-0 kubenswrapper[13984]: I0312 12:32:36.060574 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 12 12:32:36.486212 master-0 kubenswrapper[13984]: I0312 12:32:36.486166 13984 generic.go:334] "Generic (PLEG): container finished" podID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerID="379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273" exitCode=0 Mar 12 12:32:36.486689 master-0 kubenswrapper[13984]: I0312 12:32:36.486240 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273"} Mar 12 12:32:36.486689 master-0 kubenswrapper[13984]: I0312 12:32:36.486277 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerStarted","Data":"b17591b3bc628f3876e877072c2ef1407fe17291b50961427b132537d930b0e6"} Mar 12 12:32:36.488536 master-0 kubenswrapper[13984]: I0312 12:32:36.488407 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/3.log" Mar 12 12:32:36.489777 
master-0 kubenswrapper[13984]: I0312 12:32:36.489271 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-ph7gk" event={"ID":"632651f7-6641-49d8-9c48-7f6ea5846538","Type":"ContainerStarted","Data":"375aadfd869c9ab091d08b4d2263d9889a440e69cac2686493fe9d9e27639af3"} Mar 12 12:32:36.489937 master-0 kubenswrapper[13984]: I0312 12:32:36.489869 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:32:36.489937 master-0 kubenswrapper[13984]: I0312 12:32:36.489927 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="1d70193e-f178-4448-991c-129f06a03f37" Mar 12 12:32:36.519506 master-0 kubenswrapper[13984]: I0312 12:32:36.519289 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.519263726 podStartE2EDuration="519.263726ms" podCreationTimestamp="2026-03-12 12:32:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:36.514724914 +0000 UTC m=+488.712740416" watchObservedRunningTime="2026-03-12 12:32:36.519263726 +0000 UTC m=+488.717279218" Mar 12 12:32:37.607096 master-0 kubenswrapper[13984]: E0312 12:32:37.606973 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[prometheus-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" Mar 12 12:32:37.983518 master-0 kubenswrapper[13984]: I0312 12:32:37.983460 13984 scope.go:117] "RemoveContainer" containerID="a1e43c858ded002dc0dc3987b95a7db53fb8495157517583ae081816b81ce621" Mar 12 12:32:38.358403 master-0 kubenswrapper[13984]: E0312 12:32:38.358335 13984 log.go:32] "ExecSync cmd from runtime service 
failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 12:32:38.360696 master-0 kubenswrapper[13984]: E0312 12:32:38.360364 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 12:32:38.362560 master-0 kubenswrapper[13984]: E0312 12:32:38.362509 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 12 12:32:38.362748 master-0 kubenswrapper[13984]: E0312 12:32:38.362564 13984 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerName="kube-multus-additional-cni-plugins" Mar 12 12:32:38.511824 master-0 kubenswrapper[13984]: I0312 12:32:38.511778 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/4.log" Mar 12 12:32:38.512431 master-0 kubenswrapper[13984]: I0312 12:32:38.512374 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-pb97p" 
event={"ID":"c7d2a100-a24a-4ae6-bd8e-4530163a3ffe","Type":"ContainerStarted","Data":"b0953bac14f0b9ccd6aee5772146e53c12e02c8b9de5e0c0b91a72a4edc0fd6f"} Mar 12 12:32:38.516353 master-0 kubenswrapper[13984]: I0312 12:32:38.516280 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerStarted","Data":"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42"} Mar 12 12:32:38.516436 master-0 kubenswrapper[13984]: I0312 12:32:38.516314 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:32:39.525375 master-0 kubenswrapper[13984]: I0312 12:32:39.525258 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerStarted","Data":"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30"} Mar 12 12:32:39.525375 master-0 kubenswrapper[13984]: I0312 12:32:39.525306 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerStarted","Data":"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1"} Mar 12 12:32:39.525375 master-0 kubenswrapper[13984]: I0312 12:32:39.525318 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerStarted","Data":"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e"} Mar 12 12:32:39.525375 master-0 kubenswrapper[13984]: I0312 12:32:39.525330 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerStarted","Data":"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098"} Mar 12 12:32:39.525375 master-0 kubenswrapper[13984]: I0312 12:32:39.525342 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerStarted","Data":"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73"} Mar 12 12:32:39.557459 master-0 kubenswrapper[13984]: I0312 12:32:39.557366 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=375.715342833 podStartE2EDuration="6m17.557349854s" podCreationTimestamp="2026-03-12 12:26:22 +0000 UTC" firstStartedPulling="2026-03-12 12:32:36.487904602 +0000 UTC m=+488.685920124" lastFinishedPulling="2026-03-12 12:32:38.329911653 +0000 UTC m=+490.527927145" observedRunningTime="2026-03-12 12:32:39.555031876 +0000 UTC m=+491.753047378" watchObservedRunningTime="2026-03-12 12:32:39.557349854 +0000 UTC m=+491.755365346" Mar 12 12:32:40.635613 master-0 kubenswrapper[13984]: I0312 12:32:40.635561 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:32:40.636535 master-0 kubenswrapper[13984]: I0312 12:32:40.636497 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:32:40.878987 master-0 
kubenswrapper[13984]: I0312 12:32:40.878946 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 12 12:32:40.880080 master-0 kubenswrapper[13984]: I0312 12:32:40.880059 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 12 12:32:40.882281 master-0 kubenswrapper[13984]: I0312 12:32:40.882146 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 12 12:32:40.882281 master-0 kubenswrapper[13984]: I0312 12:32:40.882219 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-rwh96" Mar 12 12:32:40.897537 master-0 kubenswrapper[13984]: I0312 12:32:40.897252 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 12 12:32:40.919315 master-0 kubenswrapper[13984]: I0312 12:32:40.919268 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-7vmf8" Mar 12 12:32:40.928807 master-0 kubenswrapper[13984]: I0312 12:32:40.928770 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:32:40.940795 master-0 kubenswrapper[13984]: I0312 12:32:40.940758 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34f6763f-16be-490a-b731-85ab6928e43d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:40.940961 master-0 kubenswrapper[13984]: I0312 12:32:40.940947 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:40.941051 master-0 kubenswrapper[13984]: I0312 12:32:40.941040 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-var-lock\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.042997 master-0 kubenswrapper[13984]: I0312 12:32:41.042655 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34f6763f-16be-490a-b731-85ab6928e43d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.042997 master-0 kubenswrapper[13984]: I0312 12:32:41.042713 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.042997 master-0 kubenswrapper[13984]: I0312 12:32:41.042780 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.042997 master-0 kubenswrapper[13984]: I0312 12:32:41.042778 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-var-lock\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.042997 master-0 kubenswrapper[13984]: I0312 12:32:41.042803 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-var-lock\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.057241 master-0 kubenswrapper[13984]: I0312 12:32:41.057149 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34f6763f-16be-490a-b731-85ab6928e43d-kube-api-access\") pod \"installer-3-master-0\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.211318 master-0 kubenswrapper[13984]: I0312 12:32:41.211257 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0"
Mar 12 12:32:41.353334 master-0 kubenswrapper[13984]: I0312 12:32:41.353287 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 12:32:41.362381 master-0 kubenswrapper[13984]: W0312 12:32:41.362349 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd30590a1_9d92_4347_84ae_fffc821e6a57.slice/crio-06ee77275245407cd8c409a3f8c9468f1a5cbeabd28a8c022ac8ea47bcc72739 WatchSource:0}: Error finding container 06ee77275245407cd8c409a3f8c9468f1a5cbeabd28a8c022ac8ea47bcc72739: Status 404 returned error can't find the container with id 06ee77275245407cd8c409a3f8c9468f1a5cbeabd28a8c022ac8ea47bcc72739
Mar 12 12:32:41.541927 master-0 kubenswrapper[13984]: I0312 12:32:41.541861 13984 generic.go:334] "Generic (PLEG): container finished" podID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e" exitCode=0
Mar 12 12:32:41.541927 master-0 kubenswrapper[13984]: I0312 12:32:41.541917 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"}
Mar 12 12:32:41.542175 master-0 kubenswrapper[13984]: I0312 12:32:41.541948 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerStarted","Data":"06ee77275245407cd8c409a3f8c9468f1a5cbeabd28a8c022ac8ea47bcc72739"}
Mar 12 12:32:41.604614 master-0 kubenswrapper[13984]: I0312 12:32:41.604569 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"]
Mar 12 12:32:41.608100 master-0 kubenswrapper[13984]: W0312 12:32:41.608033 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod34f6763f_16be_490a_b731_85ab6928e43d.slice/crio-bcf23b5f3c4dba53f819adc922f968010ea137f2464de2430d7930708dc6198a WatchSource:0}: Error finding container bcf23b5f3c4dba53f819adc922f968010ea137f2464de2430d7930708dc6198a: Status 404 returned error can't find the container with id bcf23b5f3c4dba53f819adc922f968010ea137f2464de2430d7930708dc6198a
Mar 12 12:32:42.553135 master-0 kubenswrapper[13984]: I0312 12:32:42.552524 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"34f6763f-16be-490a-b731-85ab6928e43d","Type":"ContainerStarted","Data":"2d9ea3e1573ff315bd731208fb19d3cceee712895def66ad5d74e7c3ab5292de"}
Mar 12 12:32:42.553135 master-0 kubenswrapper[13984]: I0312 12:32:42.552588 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"34f6763f-16be-490a-b731-85ab6928e43d","Type":"ContainerStarted","Data":"bcf23b5f3c4dba53f819adc922f968010ea137f2464de2430d7930708dc6198a"}
Mar 12 12:32:42.573107 master-0 kubenswrapper[13984]: I0312 12:32:42.573027 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.573004478 podStartE2EDuration="2.573004478s" podCreationTimestamp="2026-03-12 12:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:42.568418274 +0000 UTC m=+494.766433796" watchObservedRunningTime="2026-03-12 12:32:42.573004478 +0000 UTC m=+494.771019970"
Mar 12 12:32:45.575149 master-0 kubenswrapper[13984]: I0312 12:32:45.575081 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerStarted","Data":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"}
Mar 12 12:32:45.575149 master-0 kubenswrapper[13984]: I0312 12:32:45.575138 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerStarted","Data":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"}
Mar 12 12:32:45.575149 master-0 kubenswrapper[13984]: I0312 12:32:45.575162 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerStarted","Data":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"}
Mar 12 12:32:46.586688 master-0 kubenswrapper[13984]: I0312 12:32:46.586613 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerStarted","Data":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"}
Mar 12 12:32:46.586688 master-0 kubenswrapper[13984]: I0312 12:32:46.586685 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerStarted","Data":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"}
Mar 12 12:32:46.586688 master-0 kubenswrapper[13984]: I0312 12:32:46.586704 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerStarted","Data":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"}
Mar 12 12:32:46.632073 master-0 kubenswrapper[13984]: I0312 12:32:46.631968 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=375.190351039 podStartE2EDuration="6m18.631946606s" podCreationTimestamp="2026-03-12 12:26:28 +0000 UTC" firstStartedPulling="2026-03-12 12:32:41.543774443 +0000 UTC m=+493.741789945" lastFinishedPulling="2026-03-12 12:32:44.98537003 +0000 UTC m=+497.183385512" observedRunningTime="2026-03-12 12:32:46.63069134 +0000 UTC m=+498.828706902" watchObservedRunningTime="2026-03-12 12:32:46.631946606 +0000 UTC m=+498.829962118"
Mar 12 12:32:48.359258 master-0 kubenswrapper[13984]: E0312 12:32:48.359184 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 12 12:32:48.361723 master-0 kubenswrapper[13984]: E0312 12:32:48.361618 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 12 12:32:48.363169 master-0 kubenswrapper[13984]: E0312 12:32:48.363123 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 12 12:32:48.363250 master-0 kubenswrapper[13984]: E0312 12:32:48.363177 13984 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerName="kube-multus-additional-cni-plugins"
Mar 12 12:32:50.929666 master-0 kubenswrapper[13984]: I0312 12:32:50.929591 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:32:51.373784 master-0 kubenswrapper[13984]: I0312 12:32:51.373682 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:32:51.373784 master-0 kubenswrapper[13984]: I0312 12:32:51.373741 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0"
Mar 12 12:32:51.374247 master-0 kubenswrapper[13984]: E0312 12:32:51.373919 13984 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:32:51.374247 master-0 kubenswrapper[13984]: E0312 12:32:51.373942 13984 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:32:51.374247 master-0 kubenswrapper[13984]: E0312 12:32:51.373997 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access podName:48e7be9a-921a-42b0-b9ae-b7ffd28c89a4 nodeName:}" failed. No retries permitted until 2026-03-12 12:34:53.373979891 +0000 UTC m=+625.571995473 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access") pod "installer-1-master-0" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 12 12:32:51.381574 master-0 kubenswrapper[13984]: I0312 12:32:51.378456 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") " pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 12 12:32:51.412724 master-0 kubenswrapper[13984]: I0312 12:32:51.412651 13984 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 12 12:32:51.413048 master-0 kubenswrapper[13984]: I0312 12:32:51.412994 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" containerID="cri-o://0d50ef84a902d60b2d7d410974b24e4f48e1a63818f16207d507864a9e96ea0a" gracePeriod=30
Mar 12 12:32:51.414818 master-0 kubenswrapper[13984]: I0312 12:32:51.414426 13984 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 12 12:32:51.415088 master-0 kubenswrapper[13984]: E0312 12:32:51.415039 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415151 master-0 kubenswrapper[13984]: I0312 12:32:51.415084 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415151 master-0 kubenswrapper[13984]: E0312 12:32:51.415130 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415233 master-0 kubenswrapper[13984]: I0312 12:32:51.415149 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415233 master-0 kubenswrapper[13984]: E0312 12:32:51.415206 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415233 master-0 kubenswrapper[13984]: I0312 12:32:51.415226 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415537 master-0 kubenswrapper[13984]: I0312 12:32:51.415512 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415598 master-0 kubenswrapper[13984]: I0312 12:32:51.415542 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415641 master-0 kubenswrapper[13984]: I0312 12:32:51.415616 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415871 master-0 kubenswrapper[13984]: E0312 12:32:51.415836 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.415871 master-0 kubenswrapper[13984]: I0312 12:32:51.415862 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.416213 master-0 kubenswrapper[13984]: I0312 12:32:51.416176 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler"
Mar 12 12:32:51.420339 master-0 kubenswrapper[13984]: I0312 12:32:51.420296 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.578403 master-0 kubenswrapper[13984]: I0312 12:32:51.578327 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") pod \"a97fcd56-aa52-414a-b370-154c1b34c1ed\" (UID: \"a97fcd56-aa52-414a-b370-154c1b34c1ed\") "
Mar 12 12:32:51.578960 master-0 kubenswrapper[13984]: I0312 12:32:51.578909 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.579092 master-0 kubenswrapper[13984]: I0312 12:32:51.579056 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.579654 master-0 kubenswrapper[13984]: I0312 12:32:51.579616 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 12 12:32:51.584065 master-0 kubenswrapper[13984]: I0312 12:32:51.583992 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a97fcd56-aa52-414a-b370-154c1b34c1ed" (UID: "a97fcd56-aa52-414a-b370-154c1b34c1ed"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:32:51.652635 master-0 kubenswrapper[13984]: I0312 12:32:51.652586 13984 generic.go:334] "Generic (PLEG): container finished" podID="950b563e-a63d-4f3b-b179-f4a2f071739f" containerID="f2eacdd07a16a2435dfc8f871501cf98e3c9bcb25686aed1ec3614fe561e3f1b" exitCode=0
Mar 12 12:32:51.652818 master-0 kubenswrapper[13984]: I0312 12:32:51.652654 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"950b563e-a63d-4f3b-b179-f4a2f071739f","Type":"ContainerDied","Data":"f2eacdd07a16a2435dfc8f871501cf98e3c9bcb25686aed1ec3614fe561e3f1b"}
Mar 12 12:32:51.655630 master-0 kubenswrapper[13984]: I0312 12:32:51.655574 13984 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="0d50ef84a902d60b2d7d410974b24e4f48e1a63818f16207d507864a9e96ea0a" exitCode=0
Mar 12 12:32:51.655748 master-0 kubenswrapper[13984]: I0312 12:32:51.655653 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71131571f78d4c3ee8f3ce7c12f7ecb51b5e096152ca03f7baeed83f355647d3"
Mar 12 12:32:51.655748 master-0 kubenswrapper[13984]: I0312 12:32:51.655678 13984 scope.go:117] "RemoveContainer" containerID="a9d9b5f96bde28a030172fa8b8562f0ad2738118cc137cd6bc087cf4fbc7972f"
Mar 12 12:32:51.679954 master-0 kubenswrapper[13984]: I0312 12:32:51.679898 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.680200 master-0 kubenswrapper[13984]: I0312 12:32:51.680014 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.680200 master-0 kubenswrapper[13984]: I0312 12:32:51.680103 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.680301 master-0 kubenswrapper[13984]: I0312 12:32:51.680213 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a97fcd56-aa52-414a-b370-154c1b34c1ed-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:51.680301 master-0 kubenswrapper[13984]: I0312 12:32:51.680257 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.719344 master-0 kubenswrapper[13984]: I0312 12:32:51.719288 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:32:51.740779 master-0 kubenswrapper[13984]: I0312 12:32:51.740714 13984 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="646805e6-ae67-45d8-b506-8b782af86ffe"
Mar 12 12:32:51.863629 master-0 kubenswrapper[13984]: I0312 12:32:51.863575 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:51.881169 master-0 kubenswrapper[13984]: W0312 12:32:51.881078 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1453f6461bf5d599ad65a4656343ee91.slice/crio-2598018ed51cf232e4f72c219568566d31cb1b1c90b980edd8aa76acf1ffafd9 WatchSource:0}: Error finding container 2598018ed51cf232e4f72c219568566d31cb1b1c90b980edd8aa76acf1ffafd9: Status 404 returned error can't find the container with id 2598018ed51cf232e4f72c219568566d31cb1b1c90b980edd8aa76acf1ffafd9
Mar 12 12:32:51.883267 master-0 kubenswrapper[13984]: I0312 12:32:51.883221 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") "
Mar 12 12:32:51.883597 master-0 kubenswrapper[13984]: I0312 12:32:51.883335 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") "
Mar 12 12:32:51.883809 master-0 kubenswrapper[13984]: I0312 12:32:51.883329 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets" (OuterVolumeSpecName: "secrets") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:32:51.883873 master-0 kubenswrapper[13984]: I0312 12:32:51.883358 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs" (OuterVolumeSpecName: "logs") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:32:51.884600 master-0 kubenswrapper[13984]: I0312 12:32:51.884555 13984 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:51.884600 master-0 kubenswrapper[13984]: I0312 12:32:51.884584 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:51.996172 master-0 kubenswrapper[13984]: I0312 12:32:51.996013 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a56802af72ce1aac6b5077f1695ac0" path="/var/lib/kubelet/pods/a1a56802af72ce1aac6b5077f1695ac0/volumes"
Mar 12 12:32:51.996792 master-0 kubenswrapper[13984]: I0312 12:32:51.996493 13984 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID=""
Mar 12 12:32:52.017486 master-0 kubenswrapper[13984]: I0312 12:32:52.017419 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 12 12:32:52.017486 master-0 kubenswrapper[13984]: I0312 12:32:52.017455 13984 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="646805e6-ae67-45d8-b506-8b782af86ffe"
Mar 12 12:32:52.021579 master-0 kubenswrapper[13984]: I0312 12:32:52.021449 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"]
Mar 12 12:32:52.021691 master-0 kubenswrapper[13984]: I0312 12:32:52.021577 13984 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="646805e6-ae67-45d8-b506-8b782af86ffe"
Mar 12 12:32:52.669031 master-0 kubenswrapper[13984]: I0312 12:32:52.668943 13984 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="7b04278396a739719fe384c1c1886c0dcc740e5de0eac8e6d5a7531d3306190a" exitCode=0
Mar 12 12:32:52.669329 master-0 kubenswrapper[13984]: I0312 12:32:52.669031 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"7b04278396a739719fe384c1c1886c0dcc740e5de0eac8e6d5a7531d3306190a"}
Mar 12 12:32:52.669329 master-0 kubenswrapper[13984]: I0312 12:32:52.669096 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"2598018ed51cf232e4f72c219568566d31cb1b1c90b980edd8aa76acf1ffafd9"}
Mar 12 12:32:52.674318 master-0 kubenswrapper[13984]: I0312 12:32:52.674195 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 12 12:32:53.229129 master-0 kubenswrapper[13984]: I0312 12:32:53.228942 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 12 12:32:53.405134 master-0 kubenswrapper[13984]: I0312 12:32:53.405092 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-var-lock\") pod \"950b563e-a63d-4f3b-b179-f4a2f071739f\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") "
Mar 12 12:32:53.405336 master-0 kubenswrapper[13984]: I0312 12:32:53.405153 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/950b563e-a63d-4f3b-b179-f4a2f071739f-kube-api-access\") pod \"950b563e-a63d-4f3b-b179-f4a2f071739f\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") "
Mar 12 12:32:53.405336 master-0 kubenswrapper[13984]: I0312 12:32:53.405213 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-kubelet-dir\") pod \"950b563e-a63d-4f3b-b179-f4a2f071739f\" (UID: \"950b563e-a63d-4f3b-b179-f4a2f071739f\") "
Mar 12 12:32:53.405464 master-0 kubenswrapper[13984]: I0312 12:32:53.405442 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "950b563e-a63d-4f3b-b179-f4a2f071739f" (UID: "950b563e-a63d-4f3b-b179-f4a2f071739f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:32:53.405525 master-0 kubenswrapper[13984]: I0312 12:32:53.405458 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-var-lock" (OuterVolumeSpecName: "var-lock") pod "950b563e-a63d-4f3b-b179-f4a2f071739f" (UID: "950b563e-a63d-4f3b-b179-f4a2f071739f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:32:53.410191 master-0 kubenswrapper[13984]: I0312 12:32:53.409151 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/950b563e-a63d-4f3b-b179-f4a2f071739f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "950b563e-a63d-4f3b-b179-f4a2f071739f" (UID: "950b563e-a63d-4f3b-b179-f4a2f071739f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:32:53.506215 master-0 kubenswrapper[13984]: I0312 12:32:53.506169 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/950b563e-a63d-4f3b-b179-f4a2f071739f-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:53.506215 master-0 kubenswrapper[13984]: I0312 12:32:53.506207 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:53.506215 master-0 kubenswrapper[13984]: I0312 12:32:53.506218 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/950b563e-a63d-4f3b-b179-f4a2f071739f-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 12 12:32:53.684018 master-0 kubenswrapper[13984]: I0312 12:32:53.683962 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"950b563e-a63d-4f3b-b179-f4a2f071739f","Type":"ContainerDied","Data":"78a7d4ac3a18f185f9cbbfda2f2311015baaf728740462b3b347f4063e780ec7"}
Mar 12 12:32:53.684018 master-0 kubenswrapper[13984]: I0312 12:32:53.684012 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78a7d4ac3a18f185f9cbbfda2f2311015baaf728740462b3b347f4063e780ec7"
Mar 12 12:32:53.684363 master-0 kubenswrapper[13984]: I0312 12:32:53.684038 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 12 12:32:53.702451 master-0 kubenswrapper[13984]: I0312 12:32:53.702388 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"4b749553c2f8e178cf4dc0db507c1ae7b44a26461cdd8dd900a40358195790bc"}
Mar 12 12:32:53.702451 master-0 kubenswrapper[13984]: I0312 12:32:53.702436 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"d751ec101ed55b50a7f5098f4f2783da9338892aeebc9f5e290dbe42e9c28f33"}
Mar 12 12:32:53.702451 master-0 kubenswrapper[13984]: I0312 12:32:53.702447 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"e4e132145eff2807dbde0d6fa8cb6326225f4d51a2f372f5d623052b7f42bd81"}
Mar 12 12:32:53.726996 master-0 kubenswrapper[13984]: I0312 12:32:53.726912 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.726892882 podStartE2EDuration="2.726892882s" podCreationTimestamp="2026-03-12 12:32:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:32:53.725407668 +0000 UTC m=+505.923423160" watchObservedRunningTime="2026-03-12 12:32:53.726892882 +0000 UTC m=+505.924908374"
Mar 12 12:32:54.719166 master-0 kubenswrapper[13984]: I0312 12:32:54.719090 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 12 12:32:58.358776 master-0 kubenswrapper[13984]: E0312 12:32:58.358688 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 12 12:32:58.361828 master-0 kubenswrapper[13984]: E0312 12:32:58.361786 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 12 12:32:58.363302 master-0 kubenswrapper[13984]: E0312 12:32:58.363262 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 12 12:32:58.363481 master-0 kubenswrapper[13984]: E0312 12:32:58.363441 13984 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerName="kube-multus-additional-cni-plugins"
Mar 12 12:32:59.757304 master-0 kubenswrapper[13984]: I0312 12:32:59.757192 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-xpzn2_74d06933-afab-43a3-a1d3-88a569178d34/multus-admission-controller/0.log"
Mar 12 12:32:59.757304 master-0 kubenswrapper[13984]: I0312 12:32:59.757249 13984 generic.go:334] "Generic (PLEG): container finished" podID="74d06933-afab-43a3-a1d3-88a569178d34" containerID="e8a0643d2fccb7a3a8b1a9999fe3ff756f073c2a63121cc490f340f51eaf7001" exitCode=137
Mar 12 12:32:59.757304 master-0 kubenswrapper[13984]: I0312 12:32:59.757279 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" event={"ID":"74d06933-afab-43a3-a1d3-88a569178d34","Type":"ContainerDied","Data":"e8a0643d2fccb7a3a8b1a9999fe3ff756f073c2a63121cc490f340f51eaf7001"}
Mar 12 12:33:00.580611 master-0 kubenswrapper[13984]: I0312 12:33:00.580109 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-xpzn2_74d06933-afab-43a3-a1d3-88a569178d34/multus-admission-controller/0.log"
Mar 12 12:33:00.580611 master-0 kubenswrapper[13984]: I0312 12:33:00.580201 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2"
Mar 12 12:33:00.714562 master-0 kubenswrapper[13984]: I0312 12:33:00.714517 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") pod \"74d06933-afab-43a3-a1d3-88a569178d34\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") "
Mar 12 12:33:00.715045 master-0 kubenswrapper[13984]: I0312 12:33:00.715019 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") pod \"74d06933-afab-43a3-a1d3-88a569178d34\" (UID: \"74d06933-afab-43a3-a1d3-88a569178d34\") "
Mar 12 12:33:00.717268 master-0 kubenswrapper[13984]: I0312 12:33:00.717214 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "74d06933-afab-43a3-a1d3-88a569178d34" (UID: "74d06933-afab-43a3-a1d3-88a569178d34"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:33:00.718018 master-0 kubenswrapper[13984]: I0312 12:33:00.717981 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq" (OuterVolumeSpecName: "kube-api-access-jcltq") pod "74d06933-afab-43a3-a1d3-88a569178d34" (UID: "74d06933-afab-43a3-a1d3-88a569178d34"). InnerVolumeSpecName "kube-api-access-jcltq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:33:00.765939 master-0 kubenswrapper[13984]: I0312 12:33:00.765906 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-xpzn2_74d06933-afab-43a3-a1d3-88a569178d34/multus-admission-controller/0.log"
Mar 12 12:33:00.766486 master-0 kubenswrapper[13984]: I0312 12:33:00.766457 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" event={"ID":"74d06933-afab-43a3-a1d3-88a569178d34","Type":"ContainerDied","Data":"91dfd404945dfa72b72f5ee55329cfd84189e41ee3e0a4c889c8d9f86c69e940"}
Mar 12 12:33:00.766631 master-0 kubenswrapper[13984]: I0312 12:33:00.766551 13984 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-xpzn2" Mar 12 12:33:00.766762 master-0 kubenswrapper[13984]: I0312 12:33:00.766603 13984 scope.go:117] "RemoveContainer" containerID="8cf7a7986d7461740b2a22c40e2265aaa2c172b0bc619290d4b6ae5354a29eb5" Mar 12 12:33:00.781870 master-0 kubenswrapper[13984]: I0312 12:33:00.781827 13984 scope.go:117] "RemoveContainer" containerID="e8a0643d2fccb7a3a8b1a9999fe3ff756f073c2a63121cc490f340f51eaf7001" Mar 12 12:33:00.816813 master-0 kubenswrapper[13984]: I0312 12:33:00.816782 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcltq\" (UniqueName: \"kubernetes.io/projected/74d06933-afab-43a3-a1d3-88a569178d34-kube-api-access-jcltq\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:00.816813 master-0 kubenswrapper[13984]: I0312 12:33:00.816813 13984 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/74d06933-afab-43a3-a1d3-88a569178d34-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:00.830509 master-0 kubenswrapper[13984]: I0312 12:33:00.830451 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-xpzn2"] Mar 12 12:33:00.850907 master-0 kubenswrapper[13984]: I0312 12:33:00.850843 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-xpzn2"] Mar 12 12:33:01.990422 master-0 kubenswrapper[13984]: I0312 12:33:01.990378 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74d06933-afab-43a3-a1d3-88a569178d34" path="/var/lib/kubelet/pods/74d06933-afab-43a3-a1d3-88a569178d34/volumes" Mar 12 12:33:02.571197 master-0 kubenswrapper[13984]: I0312 12:33:02.570815 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-9rdn5_eb40fb4f-b815-4850-b446-7bd3aae4c637/kube-multus-additional-cni-plugins/0.log" Mar 12 
12:33:02.571197 master-0 kubenswrapper[13984]: I0312 12:33:02.570880 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:33:02.745237 master-0 kubenswrapper[13984]: I0312 12:33:02.745080 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/eb40fb4f-b815-4850-b446-7bd3aae4c637-ready\") pod \"eb40fb4f-b815-4850-b446-7bd3aae4c637\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " Mar 12 12:33:02.745237 master-0 kubenswrapper[13984]: I0312 12:33:02.745151 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsf7w\" (UniqueName: \"kubernetes.io/projected/eb40fb4f-b815-4850-b446-7bd3aae4c637-kube-api-access-fsf7w\") pod \"eb40fb4f-b815-4850-b446-7bd3aae4c637\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " Mar 12 12:33:02.745237 master-0 kubenswrapper[13984]: I0312 12:33:02.745176 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb40fb4f-b815-4850-b446-7bd3aae4c637-cni-sysctl-allowlist\") pod \"eb40fb4f-b815-4850-b446-7bd3aae4c637\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " Mar 12 12:33:02.745633 master-0 kubenswrapper[13984]: I0312 12:33:02.745284 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb40fb4f-b815-4850-b446-7bd3aae4c637-tuning-conf-dir\") pod \"eb40fb4f-b815-4850-b446-7bd3aae4c637\" (UID: \"eb40fb4f-b815-4850-b446-7bd3aae4c637\") " Mar 12 12:33:02.745705 master-0 kubenswrapper[13984]: I0312 12:33:02.745635 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb40fb4f-b815-4850-b446-7bd3aae4c637-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod 
"eb40fb4f-b815-4850-b446-7bd3aae4c637" (UID: "eb40fb4f-b815-4850-b446-7bd3aae4c637"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:33:02.745941 master-0 kubenswrapper[13984]: I0312 12:33:02.745911 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb40fb4f-b815-4850-b446-7bd3aae4c637-ready" (OuterVolumeSpecName: "ready") pod "eb40fb4f-b815-4850-b446-7bd3aae4c637" (UID: "eb40fb4f-b815-4850-b446-7bd3aae4c637"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:33:02.746603 master-0 kubenswrapper[13984]: I0312 12:33:02.746555 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb40fb4f-b815-4850-b446-7bd3aae4c637-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "eb40fb4f-b815-4850-b446-7bd3aae4c637" (UID: "eb40fb4f-b815-4850-b446-7bd3aae4c637"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:33:02.750031 master-0 kubenswrapper[13984]: I0312 12:33:02.749935 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb40fb4f-b815-4850-b446-7bd3aae4c637-kube-api-access-fsf7w" (OuterVolumeSpecName: "kube-api-access-fsf7w") pod "eb40fb4f-b815-4850-b446-7bd3aae4c637" (UID: "eb40fb4f-b815-4850-b446-7bd3aae4c637"). InnerVolumeSpecName "kube-api-access-fsf7w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:33:02.780705 master-0 kubenswrapper[13984]: I0312 12:33:02.780644 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-9rdn5_eb40fb4f-b815-4850-b446-7bd3aae4c637/kube-multus-additional-cni-plugins/0.log" Mar 12 12:33:02.780705 master-0 kubenswrapper[13984]: I0312 12:33:02.780702 13984 generic.go:334] "Generic (PLEG): container finished" podID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" exitCode=137 Mar 12 12:33:02.781100 master-0 kubenswrapper[13984]: I0312 12:33:02.780735 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" event={"ID":"eb40fb4f-b815-4850-b446-7bd3aae4c637","Type":"ContainerDied","Data":"2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913"} Mar 12 12:33:02.781100 master-0 kubenswrapper[13984]: I0312 12:33:02.780764 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" event={"ID":"eb40fb4f-b815-4850-b446-7bd3aae4c637","Type":"ContainerDied","Data":"0caf853d896ff411401c6bbbffb9603c1b1eebda4c622f846ea99046904d7425"} Mar 12 12:33:02.781100 master-0 kubenswrapper[13984]: I0312 12:33:02.780783 13984 scope.go:117] "RemoveContainer" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" Mar 12 12:33:02.781100 master-0 kubenswrapper[13984]: I0312 12:33:02.780790 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-9rdn5" Mar 12 12:33:02.800860 master-0 kubenswrapper[13984]: I0312 12:33:02.800839 13984 scope.go:117] "RemoveContainer" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" Mar 12 12:33:02.801449 master-0 kubenswrapper[13984]: E0312 12:33:02.801411 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913\": container with ID starting with 2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913 not found: ID does not exist" containerID="2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913" Mar 12 12:33:02.801625 master-0 kubenswrapper[13984]: I0312 12:33:02.801460 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913"} err="failed to get container status \"2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913\": rpc error: code = NotFound desc = could not find container \"2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913\": container with ID starting with 2202b3ca46cd4fc4f0c4e285f6592b5824190baafafbf8cdc4c5830d172d7913 not found: ID does not exist" Mar 12 12:33:02.828904 master-0 kubenswrapper[13984]: I0312 12:33:02.828850 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9rdn5"] Mar 12 12:33:02.836670 master-0 kubenswrapper[13984]: I0312 12:33:02.836519 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-9rdn5"] Mar 12 12:33:02.847086 master-0 kubenswrapper[13984]: I0312 12:33:02.847022 13984 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/eb40fb4f-b815-4850-b446-7bd3aae4c637-tuning-conf-dir\") on node 
\"master-0\" DevicePath \"\"" Mar 12 12:33:02.847086 master-0 kubenswrapper[13984]: I0312 12:33:02.847076 13984 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/eb40fb4f-b815-4850-b446-7bd3aae4c637-ready\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:02.847334 master-0 kubenswrapper[13984]: I0312 12:33:02.847109 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsf7w\" (UniqueName: \"kubernetes.io/projected/eb40fb4f-b815-4850-b446-7bd3aae4c637-kube-api-access-fsf7w\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:02.847334 master-0 kubenswrapper[13984]: I0312 12:33:02.847135 13984 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/eb40fb4f-b815-4850-b446-7bd3aae4c637-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:03.987207 master-0 kubenswrapper[13984]: I0312 12:33:03.987137 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" path="/var/lib/kubelet/pods/eb40fb4f-b815-4850-b446-7bd3aae4c637/volumes" Mar 12 12:33:08.890149 master-0 kubenswrapper[13984]: E0312 12:33:08.890070 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" podUID="c424f946-e2fe-4450-816b-b79640269ff5" Mar 12 12:33:09.830029 master-0 kubenswrapper[13984]: I0312 12:33:09.829965 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:33:10.283926 master-0 kubenswrapper[13984]: I0312 12:33:10.283843 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:33:10.285747 master-0 kubenswrapper[13984]: I0312 12:33:10.285684 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c424f946-e2fe-4450-816b-b79640269ff5-trusted-ca\") pod \"console-operator-6c7fb6b958-rnnjn\" (UID: \"c424f946-e2fe-4450-816b-b79640269ff5\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:33:10.431716 master-0 kubenswrapper[13984]: I0312 12:33:10.431565 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:33:10.940051 master-0 kubenswrapper[13984]: I0312 12:33:10.939965 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-rnnjn"] Mar 12 12:33:10.948325 master-0 kubenswrapper[13984]: W0312 12:33:10.948242 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc424f946_e2fe_4450_816b_b79640269ff5.slice/crio-1026a7bd53f44771fc1f2a6ac5b6d2ad42720d885be05ec84077b7fdf374a6cf WatchSource:0}: Error finding container 1026a7bd53f44771fc1f2a6ac5b6d2ad42720d885be05ec84077b7fdf374a6cf: Status 404 returned error can't find the container with id 1026a7bd53f44771fc1f2a6ac5b6d2ad42720d885be05ec84077b7fdf374a6cf Mar 12 12:33:11.861038 master-0 kubenswrapper[13984]: I0312 12:33:11.860966 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" event={"ID":"c424f946-e2fe-4450-816b-b79640269ff5","Type":"ContainerStarted","Data":"1026a7bd53f44771fc1f2a6ac5b6d2ad42720d885be05ec84077b7fdf374a6cf"} Mar 12 12:33:13.887083 master-0 kubenswrapper[13984]: I0312 12:33:13.887008 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" event={"ID":"c424f946-e2fe-4450-816b-b79640269ff5","Type":"ContainerStarted","Data":"c18ff40bb32908df509e6e0b7438b0bfee4f3a9a5f360c5b805564bc15fdfee8"} Mar 12 12:33:13.887555 master-0 kubenswrapper[13984]: I0312 12:33:13.887368 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:33:13.889680 master-0 kubenswrapper[13984]: I0312 12:33:13.889597 13984 patch_prober.go:28] interesting pod/console-operator-6c7fb6b958-rnnjn container/console-operator namespace/openshift-console-operator: Readiness probe 
status=failure output="Get \"https://10.128.0.61:8443/readyz\": dial tcp 10.128.0.61:8443: connect: connection refused" start-of-body= Mar 12 12:33:13.889841 master-0 kubenswrapper[13984]: I0312 12:33:13.889695 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" podUID="c424f946-e2fe-4450-816b-b79640269ff5" containerName="console-operator" probeResult="failure" output="Get \"https://10.128.0.61:8443/readyz\": dial tcp 10.128.0.61:8443: connect: connection refused" Mar 12 12:33:13.914275 master-0 kubenswrapper[13984]: I0312 12:33:13.914109 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" podStartSLOduration=495.216602348 podStartE2EDuration="8m17.914072609s" podCreationTimestamp="2026-03-12 12:24:56 +0000 UTC" firstStartedPulling="2026-03-12 12:33:10.950540024 +0000 UTC m=+523.148555516" lastFinishedPulling="2026-03-12 12:33:13.648010285 +0000 UTC m=+525.846025777" observedRunningTime="2026-03-12 12:33:13.912017159 +0000 UTC m=+526.110032671" watchObservedRunningTime="2026-03-12 12:33:13.914072609 +0000 UTC m=+526.112088121" Mar 12 12:33:14.390724 master-0 kubenswrapper[13984]: I0312 12:33:14.390656 13984 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:33:14.391098 master-0 kubenswrapper[13984]: I0312 12:33:14.391027 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://7405ab984b4b29bcfc8c9051e359b9cf1d8a4e58dd56cb0807b7d8e784ee3072" gracePeriod=30 Mar 12 12:33:14.391098 master-0 kubenswrapper[13984]: I0312 12:33:14.391083 13984 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" containerID="cri-o://37ce22b830e08d23df66196c50da46b1d74b7f16a0a3129542989b5d4bdc3dd3" gracePeriod=30 Mar 12 12:33:14.391271 master-0 kubenswrapper[13984]: I0312 12:33:14.391110 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" containerID="cri-o://765bad2b5a42ad204d381d1e7b6c234c53c95f09dcfc3c8f5a96103e042780ef" gracePeriod=30 Mar 12 12:33:14.391271 master-0 kubenswrapper[13984]: E0312 12:33:14.391081 13984 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-controller-manager-pod.yaml\": /etc/kubernetes/manifests/kube-controller-manager-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file" Mar 12 12:33:14.391271 master-0 kubenswrapper[13984]: I0312 12:33:14.391110 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://a8fedfcde448bdde81261d7ac4ac60d9af5f9d81f27ef839a88514a7008192a6" gracePeriod=30 Mar 12 12:33:14.392887 master-0 kubenswrapper[13984]: I0312 12:33:14.392825 13984 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:33:14.393150 master-0 kubenswrapper[13984]: E0312 12:33:14.393113 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393150 master-0 kubenswrapper[13984]: I0312 12:33:14.393139 13984 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393150 master-0 kubenswrapper[13984]: E0312 12:33:14.393154 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="950b563e-a63d-4f3b-b179-f4a2f071739f" containerName="installer" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393166 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="950b563e-a63d-4f3b-b179-f4a2f071739f" containerName="installer" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393189 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393203 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393227 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="kube-rbac-proxy" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393238 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="kube-rbac-proxy" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393260 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-recovery-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393272 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-recovery-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393288 13984 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerName="kube-multus-additional-cni-plugins" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393299 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerName="kube-multus-additional-cni-plugins" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393316 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393324 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393339 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="multus-admission-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393347 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="multus-admission-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393362 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393370 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393379 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-cert-syncer" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393390 13984 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-cert-syncer" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393408 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393417 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: E0312 12:33:14.393431 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" Mar 12 12:33:14.393425 master-0 kubenswrapper[13984]: I0312 12:33:14.393439 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393648 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="950b563e-a63d-4f3b-b179-f4a2f071739f" containerName="installer" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393683 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393703 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393725 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393739 13984 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-cert-syncer" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393774 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="kube-rbac-proxy" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393800 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="74d06933-afab-43a3-a1d3-88a569178d34" containerName="multus-admission-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393818 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb40fb4f-b815-4850-b446-7bd3aae4c637" containerName="kube-multus-additional-cni-plugins" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393834 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.393848 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager-recovery-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: E0312 12:33:14.394056 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.394072 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.394324 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.394350 13984 
memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="kube-controller-manager" Mar 12 12:33:14.394587 master-0 kubenswrapper[13984]: I0312 12:33:14.394368 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="161fce36d846c7ce98305d8ed6c23827" containerName="cluster-policy-controller" Mar 12 12:33:14.444338 master-0 kubenswrapper[13984]: I0312 12:33:14.444269 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d2a6a89fd8fe0c6f59a2124101057324\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.444537 master-0 kubenswrapper[13984]: I0312 12:33:14.444437 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d2a6a89fd8fe0c6f59a2124101057324\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.546014 master-0 kubenswrapper[13984]: I0312 12:33:14.545950 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d2a6a89fd8fe0c6f59a2124101057324\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.546219 master-0 kubenswrapper[13984]: I0312 12:33:14.546057 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: 
\"d2a6a89fd8fe0c6f59a2124101057324\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.546219 master-0 kubenswrapper[13984]: I0312 12:33:14.546076 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d2a6a89fd8fe0c6f59a2124101057324\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.546219 master-0 kubenswrapper[13984]: I0312 12:33:14.546171 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d2a6a89fd8fe0c6f59a2124101057324\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.586614 master-0 kubenswrapper[13984]: I0312 12:33:14.586581 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/3.log" Mar 12 12:33:14.587910 master-0 kubenswrapper[13984]: I0312 12:33:14.587884 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager-cert-syncer/0.log" Mar 12 12:33:14.588320 master-0 kubenswrapper[13984]: I0312 12:33:14.588299 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:33:14.588419 master-0 kubenswrapper[13984]: I0312 12:33:14.588408 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.591919 master-0 kubenswrapper[13984]: I0312 12:33:14.591843 13984 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="161fce36d846c7ce98305d8ed6c23827" podUID="d2a6a89fd8fe0c6f59a2124101057324" Mar 12 12:33:14.748808 master-0 kubenswrapper[13984]: I0312 12:33:14.748754 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") pod \"161fce36d846c7ce98305d8ed6c23827\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " Mar 12 12:33:14.749056 master-0 kubenswrapper[13984]: I0312 12:33:14.748896 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") pod \"161fce36d846c7ce98305d8ed6c23827\" (UID: \"161fce36d846c7ce98305d8ed6c23827\") " Mar 12 12:33:14.749056 master-0 kubenswrapper[13984]: I0312 12:33:14.748921 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "161fce36d846c7ce98305d8ed6c23827" (UID: "161fce36d846c7ce98305d8ed6c23827"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:33:14.749056 master-0 kubenswrapper[13984]: I0312 12:33:14.748998 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "161fce36d846c7ce98305d8ed6c23827" (UID: "161fce36d846c7ce98305d8ed6c23827"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:33:14.749250 master-0 kubenswrapper[13984]: I0312 12:33:14.749225 13984 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:14.749250 master-0 kubenswrapper[13984]: I0312 12:33:14.749244 13984 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/161fce36d846c7ce98305d8ed6c23827-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:14.897289 master-0 kubenswrapper[13984]: I0312 12:33:14.897236 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/cluster-policy-controller/3.log" Mar 12 12:33:14.898714 master-0 kubenswrapper[13984]: I0312 12:33:14.898686 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager-cert-syncer/0.log" Mar 12 12:33:14.899307 master-0 kubenswrapper[13984]: I0312 12:33:14.899277 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager/0.log" Mar 12 12:33:14.899342 master-0 kubenswrapper[13984]: I0312 12:33:14.899328 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="37ce22b830e08d23df66196c50da46b1d74b7f16a0a3129542989b5d4bdc3dd3" exitCode=0 Mar 12 12:33:14.899370 master-0 kubenswrapper[13984]: I0312 12:33:14.899345 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="765bad2b5a42ad204d381d1e7b6c234c53c95f09dcfc3c8f5a96103e042780ef" exitCode=0 Mar 12 12:33:14.899370 master-0 
kubenswrapper[13984]: I0312 12:33:14.899356 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="a8fedfcde448bdde81261d7ac4ac60d9af5f9d81f27ef839a88514a7008192a6" exitCode=0 Mar 12 12:33:14.899370 master-0 kubenswrapper[13984]: I0312 12:33:14.899366 13984 generic.go:334] "Generic (PLEG): container finished" podID="161fce36d846c7ce98305d8ed6c23827" containerID="7405ab984b4b29bcfc8c9051e359b9cf1d8a4e58dd56cb0807b7d8e784ee3072" exitCode=2 Mar 12 12:33:14.899537 master-0 kubenswrapper[13984]: I0312 12:33:14.899400 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:14.899537 master-0 kubenswrapper[13984]: I0312 12:33:14.899432 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e568f13a07418417de76bfa78410ad7d24450d5005cdba1993cc259dbab22c6" Mar 12 12:33:14.899537 master-0 kubenswrapper[13984]: I0312 12:33:14.899449 13984 scope.go:117] "RemoveContainer" containerID="ce8e150f5723be396e61f3cc061f8bb188c4c0e1fb00142a945286d4c46a00c4" Mar 12 12:33:14.903018 master-0 kubenswrapper[13984]: I0312 12:33:14.902974 13984 generic.go:334] "Generic (PLEG): container finished" podID="34f6763f-16be-490a-b731-85ab6928e43d" containerID="2d9ea3e1573ff315bd731208fb19d3cceee712895def66ad5d74e7c3ab5292de" exitCode=0 Mar 12 12:33:14.903153 master-0 kubenswrapper[13984]: I0312 12:33:14.903094 13984 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="161fce36d846c7ce98305d8ed6c23827" podUID="d2a6a89fd8fe0c6f59a2124101057324" Mar 12 12:33:14.903227 master-0 kubenswrapper[13984]: I0312 12:33:14.903175 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" 
event={"ID":"34f6763f-16be-490a-b731-85ab6928e43d","Type":"ContainerDied","Data":"2d9ea3e1573ff315bd731208fb19d3cceee712895def66ad5d74e7c3ab5292de"} Mar 12 12:33:14.910863 master-0 kubenswrapper[13984]: I0312 12:33:14.908879 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-rnnjn" Mar 12 12:33:14.917286 master-0 kubenswrapper[13984]: I0312 12:33:14.917250 13984 scope.go:117] "RemoveContainer" containerID="cfafaefe1186e2aaf0be8643dd0cbdc3cd2e42fe49f94ac73da95ad15ec95688" Mar 12 12:33:14.925423 master-0 kubenswrapper[13984]: I0312 12:33:14.925372 13984 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="161fce36d846c7ce98305d8ed6c23827" podUID="d2a6a89fd8fe0c6f59a2124101057324" Mar 12 12:33:15.911203 master-0 kubenswrapper[13984]: I0312 12:33:15.911068 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_161fce36d846c7ce98305d8ed6c23827/kube-controller-manager-cert-syncer/0.log" Mar 12 12:33:15.987136 master-0 kubenswrapper[13984]: I0312 12:33:15.987088 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="161fce36d846c7ce98305d8ed6c23827" path="/var/lib/kubelet/pods/161fce36d846c7ce98305d8ed6c23827/volumes" Mar 12 12:33:16.274208 master-0 kubenswrapper[13984]: I0312 12:33:16.274150 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 12 12:33:16.380303 master-0 kubenswrapper[13984]: I0312 12:33:16.380264 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-var-lock\") pod \"34f6763f-16be-490a-b731-85ab6928e43d\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " Mar 12 12:33:16.380547 master-0 kubenswrapper[13984]: I0312 12:33:16.380468 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-var-lock" (OuterVolumeSpecName: "var-lock") pod "34f6763f-16be-490a-b731-85ab6928e43d" (UID: "34f6763f-16be-490a-b731-85ab6928e43d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:33:16.380658 master-0 kubenswrapper[13984]: I0312 12:33:16.380641 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34f6763f-16be-490a-b731-85ab6928e43d-kube-api-access\") pod \"34f6763f-16be-490a-b731-85ab6928e43d\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " Mar 12 12:33:16.380746 master-0 kubenswrapper[13984]: I0312 12:33:16.380734 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-kubelet-dir\") pod \"34f6763f-16be-490a-b731-85ab6928e43d\" (UID: \"34f6763f-16be-490a-b731-85ab6928e43d\") " Mar 12 12:33:16.380907 master-0 kubenswrapper[13984]: I0312 12:33:16.380854 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "34f6763f-16be-490a-b731-85ab6928e43d" (UID: "34f6763f-16be-490a-b731-85ab6928e43d"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:33:16.381055 master-0 kubenswrapper[13984]: I0312 12:33:16.381040 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:16.381122 master-0 kubenswrapper[13984]: I0312 12:33:16.381110 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34f6763f-16be-490a-b731-85ab6928e43d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:16.383661 master-0 kubenswrapper[13984]: I0312 12:33:16.383615 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f6763f-16be-490a-b731-85ab6928e43d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "34f6763f-16be-490a-b731-85ab6928e43d" (UID: "34f6763f-16be-490a-b731-85ab6928e43d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:33:16.482562 master-0 kubenswrapper[13984]: I0312 12:33:16.482395 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34f6763f-16be-490a-b731-85ab6928e43d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:33:16.921055 master-0 kubenswrapper[13984]: I0312 12:33:16.920553 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"34f6763f-16be-490a-b731-85ab6928e43d","Type":"ContainerDied","Data":"bcf23b5f3c4dba53f819adc922f968010ea137f2464de2430d7930708dc6198a"} Mar 12 12:33:16.921055 master-0 kubenswrapper[13984]: I0312 12:33:16.920596 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bcf23b5f3c4dba53f819adc922f968010ea137f2464de2430d7930708dc6198a" Mar 12 12:33:16.921055 master-0 kubenswrapper[13984]: I0312 12:33:16.920679 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 12 12:33:26.979396 master-0 kubenswrapper[13984]: I0312 12:33:26.979323 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:27.002926 master-0 kubenswrapper[13984]: I0312 12:33:27.002868 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7f6e0ef8-afc8-44cc-a1be-bcf10bc934c5" Mar 12 12:33:27.002926 master-0 kubenswrapper[13984]: I0312 12:33:27.002908 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7f6e0ef8-afc8-44cc-a1be-bcf10bc934c5" Mar 12 12:33:27.031529 master-0 kubenswrapper[13984]: I0312 12:33:27.031411 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:33:27.033446 master-0 kubenswrapper[13984]: I0312 12:33:27.033403 13984 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:27.038760 master-0 kubenswrapper[13984]: I0312 12:33:27.038669 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:33:27.046550 master-0 kubenswrapper[13984]: I0312 12:33:27.046450 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:27.050297 master-0 kubenswrapper[13984]: I0312 12:33:27.050250 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:33:27.078523 master-0 kubenswrapper[13984]: W0312 12:33:27.078389 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2a6a89fd8fe0c6f59a2124101057324.slice/crio-7a795a93c95230d96f24fb9687c767dcd17a6e807544c1eefa3ba1a7fb893526 WatchSource:0}: Error finding container 7a795a93c95230d96f24fb9687c767dcd17a6e807544c1eefa3ba1a7fb893526: Status 404 returned error can't find the container with id 7a795a93c95230d96f24fb9687c767dcd17a6e807544c1eefa3ba1a7fb893526 Mar 12 12:33:28.002612 master-0 kubenswrapper[13984]: I0312 12:33:28.002216 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d2a6a89fd8fe0c6f59a2124101057324","Type":"ContainerStarted","Data":"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63"} Mar 12 12:33:28.002612 master-0 kubenswrapper[13984]: I0312 12:33:28.002304 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d2a6a89fd8fe0c6f59a2124101057324","Type":"ContainerStarted","Data":"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a"} Mar 12 12:33:28.002612 master-0 kubenswrapper[13984]: I0312 12:33:28.002350 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d2a6a89fd8fe0c6f59a2124101057324","Type":"ContainerStarted","Data":"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482"} Mar 12 12:33:28.002612 master-0 kubenswrapper[13984]: I0312 12:33:28.002366 13984 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d2a6a89fd8fe0c6f59a2124101057324","Type":"ContainerStarted","Data":"7a795a93c95230d96f24fb9687c767dcd17a6e807544c1eefa3ba1a7fb893526"} Mar 12 12:33:28.268025 master-0 kubenswrapper[13984]: I0312 12:33:28.267992 13984 scope.go:117] "RemoveContainer" containerID="7405ab984b4b29bcfc8c9051e359b9cf1d8a4e58dd56cb0807b7d8e784ee3072" Mar 12 12:33:28.292161 master-0 kubenswrapper[13984]: I0312 12:33:28.292103 13984 scope.go:117] "RemoveContainer" containerID="a8fedfcde448bdde81261d7ac4ac60d9af5f9d81f27ef839a88514a7008192a6" Mar 12 12:33:29.015837 master-0 kubenswrapper[13984]: I0312 12:33:29.015775 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d2a6a89fd8fe0c6f59a2124101057324","Type":"ContainerStarted","Data":"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c"} Mar 12 12:33:37.070571 master-0 kubenswrapper[13984]: I0312 12:33:37.070329 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.070571 master-0 kubenswrapper[13984]: I0312 12:33:37.070388 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.070571 master-0 kubenswrapper[13984]: I0312 12:33:37.070400 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.070571 master-0 kubenswrapper[13984]: I0312 12:33:37.070411 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.073824 master-0 kubenswrapper[13984]: I0312 12:33:37.073791 13984 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.077687 master-0 kubenswrapper[13984]: I0312 12:33:37.077654 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.093433 master-0 kubenswrapper[13984]: I0312 12:33:37.093039 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.093433 master-0 kubenswrapper[13984]: I0312 12:33:37.093372 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:33:37.093690 master-0 kubenswrapper[13984]: I0312 12:33:37.093470 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=10.093457717 podStartE2EDuration="10.093457717s" podCreationTimestamp="2026-03-12 12:33:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:33:29.044978584 +0000 UTC m=+541.242994086" watchObservedRunningTime="2026-03-12 12:33:37.093457717 +0000 UTC m=+549.291473209" Mar 12 12:33:37.094130 master-0 kubenswrapper[13984]: I0312 12:33:37.094101 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 12 12:33:37.094409 master-0 kubenswrapper[13984]: E0312 12:33:37.094392 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f6763f-16be-490a-b731-85ab6928e43d" containerName="installer" Mar 12 12:33:37.094409 master-0 kubenswrapper[13984]: I0312 12:33:37.094409 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f6763f-16be-490a-b731-85ab6928e43d" 
containerName="installer" Mar 12 12:33:37.094714 master-0 kubenswrapper[13984]: I0312 12:33:37.094694 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f6763f-16be-490a-b731-85ab6928e43d" containerName="installer" Mar 12 12:33:37.095313 master-0 kubenswrapper[13984]: I0312 12:33:37.095258 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.098358 master-0 kubenswrapper[13984]: I0312 12:33:37.098296 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-bkl67" Mar 12 12:33:37.098433 master-0 kubenswrapper[13984]: I0312 12:33:37.098322 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 12 12:33:37.114554 master-0 kubenswrapper[13984]: I0312 12:33:37.113842 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 12 12:33:37.123273 master-0 kubenswrapper[13984]: I0312 12:33:37.123202 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343917c5-143b-4807-aac0-c3aef3105186-kube-api-access\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.123273 master-0 kubenswrapper[13984]: I0312 12:33:37.123273 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.123459 master-0 kubenswrapper[13984]: I0312 12:33:37.123428 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-var-lock\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.225650 master-0 kubenswrapper[13984]: I0312 12:33:37.224753 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.225650 master-0 kubenswrapper[13984]: I0312 12:33:37.224878 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-var-lock\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.225650 master-0 kubenswrapper[13984]: I0312 12:33:37.224893 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.225650 master-0 kubenswrapper[13984]: I0312 12:33:37.224971 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-var-lock\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.225650 master-0 kubenswrapper[13984]: I0312 12:33:37.225134 13984 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343917c5-143b-4807-aac0-c3aef3105186-kube-api-access\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.244222 master-0 kubenswrapper[13984]: I0312 12:33:37.244163 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343917c5-143b-4807-aac0-c3aef3105186-kube-api-access\") pod \"installer-2-master-0\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.432185 master-0 kubenswrapper[13984]: I0312 12:33:37.432093 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:33:37.844563 master-0 kubenswrapper[13984]: I0312 12:33:37.843174 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 12 12:33:37.851985 master-0 kubenswrapper[13984]: W0312 12:33:37.851927 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod343917c5_143b_4807_aac0_c3aef3105186.slice/crio-786b6c917163257cc98cdcd743ba9fdde876ebceb2ed624febbccb5798126329 WatchSource:0}: Error finding container 786b6c917163257cc98cdcd743ba9fdde876ebceb2ed624febbccb5798126329: Status 404 returned error can't find the container with id 786b6c917163257cc98cdcd743ba9fdde876ebceb2ed624febbccb5798126329 Mar 12 12:33:38.098065 master-0 kubenswrapper[13984]: I0312 12:33:38.097473 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"343917c5-143b-4807-aac0-c3aef3105186","Type":"ContainerStarted","Data":"786b6c917163257cc98cdcd743ba9fdde876ebceb2ed624febbccb5798126329"} Mar 12 12:33:39.111322 master-0 kubenswrapper[13984]: I0312 12:33:39.111229 13984 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"343917c5-143b-4807-aac0-c3aef3105186","Type":"ContainerStarted","Data":"7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278"} Mar 12 12:33:39.133259 master-0 kubenswrapper[13984]: I0312 12:33:39.133181 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.133163569 podStartE2EDuration="2.133163569s" podCreationTimestamp="2026-03-12 12:33:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:33:39.13043263 +0000 UTC m=+551.328448202" watchObservedRunningTime="2026-03-12 12:33:39.133163569 +0000 UTC m=+551.331179061" Mar 12 12:33:40.930093 master-0 kubenswrapper[13984]: I0312 12:33:40.930018 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:33:40.965968 master-0 kubenswrapper[13984]: I0312 12:33:40.965901 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:33:41.177602 master-0 kubenswrapper[13984]: I0312 12:33:41.176048 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:33:41.872241 master-0 kubenswrapper[13984]: I0312 12:33:41.872192 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 12 12:33:48.314908 master-0 kubenswrapper[13984]: I0312 12:33:48.310064 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-84f57b9877-t5sfc"] Mar 12 12:33:48.314908 master-0 kubenswrapper[13984]: I0312 12:33:48.311768 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-84f57b9877-t5sfc" Mar 12 12:33:48.325502 master-0 kubenswrapper[13984]: I0312 12:33:48.319153 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6686f9695f-gkbkr"] Mar 12 12:33:48.325502 master-0 kubenswrapper[13984]: I0312 12:33:48.320033 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6686f9695f-gkbkr" Mar 12 12:33:48.351538 master-0 kubenswrapper[13984]: I0312 12:33:48.348862 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 12 12:33:48.351538 master-0 kubenswrapper[13984]: W0312 12:33:48.349122 13984 reflector.go:561] object-"openshift-console"/"default-dockercfg-zrwsh": failed to list *v1.Secret: secrets "default-dockercfg-zrwsh" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-console": no relationship found between node 'master-0' and this object Mar 12 12:33:48.351538 master-0 kubenswrapper[13984]: E0312 12:33:48.349158 13984 reflector.go:158] "Unhandled Error" err="object-\"openshift-console\"/\"default-dockercfg-zrwsh\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"default-dockercfg-zrwsh\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-console\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 12 12:33:48.386583 master-0 kubenswrapper[13984]: I0312 12:33:48.385881 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 12:33:48.386583 master-0 kubenswrapper[13984]: I0312 12:33:48.386156 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-zkpl5" Mar 12 12:33:48.386583 master-0 kubenswrapper[13984]: I0312 
12:33:48.386367 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 12 12:33:48.386583 master-0 kubenswrapper[13984]: I0312 12:33:48.386450 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 12 12:33:48.386861 master-0 kubenswrapper[13984]: I0312 12:33:48.386779 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 12 12:33:48.386861 master-0 kubenswrapper[13984]: I0312 12:33:48.386851 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 12 12:33:48.387053 master-0 kubenswrapper[13984]: I0312 12:33:48.387024 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 12 12:33:48.390188 master-0 kubenswrapper[13984]: I0312 12:33:48.390142 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"]
Mar 12 12:33:48.391361 master-0 kubenswrapper[13984]: I0312 12:33:48.391329 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.401516 master-0 kubenswrapper[13984]: I0312 12:33:48.399391 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402627 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-oauth-config\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402695 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-service-ca\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402731 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-console-config\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402778 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqczz\" (UniqueName: \"kubernetes.io/projected/de8d0297-e472-45a4-9658-1d893e4c34aa-kube-api-access-mqczz\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402817 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-serving-cert\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402849 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhmgm\" (UniqueName: \"kubernetes.io/projected/aa78efd5-b1b1-4b64-8ece-480566fadbca-kube-api-access-fhmgm\") pod \"downloads-84f57b9877-t5sfc\" (UID: \"aa78efd5-b1b1-4b64-8ece-480566fadbca\") " pod="openshift-console/downloads-84f57b9877-t5sfc"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402870 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-trusted-ca-bundle\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.406666 master-0 kubenswrapper[13984]: I0312 12:33:48.402884 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-oauth-serving-cert\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.410369 master-0 kubenswrapper[13984]: I0312 12:33:48.408639 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-t5sfc"]
Mar 12 12:33:48.413621 master-0 kubenswrapper[13984]: I0312 12:33:48.411234 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6686f9695f-gkbkr"]
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.422608 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.422826 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.422986 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.423123 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.423284 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.423422 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.423516 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.423580 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 12 12:33:48.424084 master-0 kubenswrapper[13984]: I0312 12:33:48.423649 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 12 12:33:48.426378 master-0 kubenswrapper[13984]: I0312 12:33:48.426321 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 12 12:33:48.426633 master-0 kubenswrapper[13984]: I0312 12:33:48.426610 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 12 12:33:48.428877 master-0 kubenswrapper[13984]: I0312 12:33:48.428848 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 12 12:33:48.444545 master-0 kubenswrapper[13984]: I0312 12:33:48.439292 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 12 12:33:48.461763 master-0 kubenswrapper[13984]: I0312 12:33:48.453278 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"]
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506228 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-oauth-config\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506278 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-service-ca\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506301 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-console-config\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506330 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506356 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqczz\" (UniqueName: \"kubernetes.io/projected/de8d0297-e472-45a4-9658-1d893e4c34aa-kube-api-access-mqczz\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506378 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506399 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-session\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506420 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506439 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-service-ca\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506458 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-error\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506488 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-serving-cert\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506508 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-router-certs\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506525 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2l7l\" (UniqueName: \"kubernetes.io/projected/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-kube-api-access-f2l7l\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506543 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506565 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhmgm\" (UniqueName: \"kubernetes.io/projected/aa78efd5-b1b1-4b64-8ece-480566fadbca-kube-api-access-fhmgm\") pod \"downloads-84f57b9877-t5sfc\" (UID: \"aa78efd5-b1b1-4b64-8ece-480566fadbca\") " pod="openshift-console/downloads-84f57b9877-t5sfc"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506580 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506598 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-audit-dir\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506615 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-trusted-ca-bundle\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506631 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-oauth-serving-cert\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506663 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-audit-policies\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.506693 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-login\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.508861 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-oauth-serving-cert\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.509633 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-service-ca\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.510173 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-oauth-config\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.510939 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-trusted-ca-bundle\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.516432 master-0 kubenswrapper[13984]: I0312 12:33:48.514774 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-console-config\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.517777 master-0 kubenswrapper[13984]: I0312 12:33:48.517049 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-serving-cert\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.532984 master-0 kubenswrapper[13984]: I0312 12:33:48.531401 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqczz\" (UniqueName: \"kubernetes.io/projected/de8d0297-e472-45a4-9658-1d893e4c34aa-kube-api-access-mqczz\") pod \"console-6686f9695f-gkbkr\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") " pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.532984 master-0 kubenswrapper[13984]: I0312 12:33:48.532719 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhmgm\" (UniqueName: \"kubernetes.io/projected/aa78efd5-b1b1-4b64-8ece-480566fadbca-kube-api-access-fhmgm\") pod \"downloads-84f57b9877-t5sfc\" (UID: \"aa78efd5-b1b1-4b64-8ece-480566fadbca\") " pod="openshift-console/downloads-84f57b9877-t5sfc"
Mar 12 12:33:48.608521 master-0 kubenswrapper[13984]: I0312 12:33:48.608376 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608521 master-0 kubenswrapper[13984]: I0312 12:33:48.608444 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608754 master-0 kubenswrapper[13984]: I0312 12:33:48.608661 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-session\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608754 master-0 kubenswrapper[13984]: I0312 12:33:48.608719 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608817 master-0 kubenswrapper[13984]: I0312 12:33:48.608758 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-service-ca\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608817 master-0 kubenswrapper[13984]: I0312 12:33:48.608789 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-error\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608880 master-0 kubenswrapper[13984]: I0312 12:33:48.608819 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-router-certs\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608880 master-0 kubenswrapper[13984]: I0312 12:33:48.608841 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608880 master-0 kubenswrapper[13984]: I0312 12:33:48.608858 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2l7l\" (UniqueName: \"kubernetes.io/projected/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-kube-api-access-f2l7l\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608972 master-0 kubenswrapper[13984]: I0312 12:33:48.608889 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.608972 master-0 kubenswrapper[13984]: I0312 12:33:48.608916 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-audit-dir\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.609272 master-0 kubenswrapper[13984]: I0312 12:33:48.609197 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-audit-policies\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.609272 master-0 kubenswrapper[13984]: I0312 12:33:48.609256 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-audit-dir\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.609428 master-0 kubenswrapper[13984]: I0312 12:33:48.609387 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-login\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.610121 master-0 kubenswrapper[13984]: I0312 12:33:48.610094 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-audit-policies\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.610832 master-0 kubenswrapper[13984]: I0312 12:33:48.610685 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-cliconfig\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.611282 master-0 kubenswrapper[13984]: I0312 12:33:48.611253 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-service-ca\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.611835 master-0 kubenswrapper[13984]: I0312 12:33:48.611803 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.621885 master-0 kubenswrapper[13984]: I0312 12:33:48.614323 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-login\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.621885 master-0 kubenswrapper[13984]: I0312 12:33:48.614326 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-serving-cert\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.621885 master-0 kubenswrapper[13984]: I0312 12:33:48.614361 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.621885 master-0 kubenswrapper[13984]: I0312 12:33:48.614602 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-user-template-error\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.621885 master-0 kubenswrapper[13984]: I0312 12:33:48.614842 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-router-certs\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.621885 master-0 kubenswrapper[13984]: I0312 12:33:48.615618 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.621885 master-0 kubenswrapper[13984]: I0312 12:33:48.619753 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-v4-0-config-system-session\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.624709 master-0 kubenswrapper[13984]: I0312 12:33:48.624679 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2l7l\" (UniqueName: \"kubernetes.io/projected/38ef5c51-5afe-44c6-bfb5-5f80002f72fd-kube-api-access-f2l7l\") pod \"oauth-openshift-556cc97fc4-pjrv5\" (UID: \"38ef5c51-5afe-44c6-bfb5-5f80002f72fd\") " pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:48.767511 master-0 kubenswrapper[13984]: I0312 12:33:48.767402 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:33:48.788980 master-0 kubenswrapper[13984]: I0312 12:33:48.788932 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:49.199510 master-0 kubenswrapper[13984]: I0312 12:33:49.199429 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6686f9695f-gkbkr"]
Mar 12 12:33:49.202866 master-0 kubenswrapper[13984]: W0312 12:33:49.202788 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde8d0297_e472_45a4_9658_1d893e4c34aa.slice/crio-57dfe76efbf7deb3a53d129eabb1be70578bcf3fbe93a2dd467072510efe925e WatchSource:0}: Error finding container 57dfe76efbf7deb3a53d129eabb1be70578bcf3fbe93a2dd467072510efe925e: Status 404 returned error can't find the container with id 57dfe76efbf7deb3a53d129eabb1be70578bcf3fbe93a2dd467072510efe925e
Mar 12 12:33:49.259962 master-0 kubenswrapper[13984]: I0312 12:33:49.259911 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-zrwsh"
Mar 12 12:33:49.270121 master-0 kubenswrapper[13984]: I0312 12:33:49.269924 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-t5sfc"
Mar 12 12:33:49.315996 master-0 kubenswrapper[13984]: I0312 12:33:49.315949 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"]
Mar 12 12:33:49.674045 master-0 kubenswrapper[13984]: W0312 12:33:49.673981 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa78efd5_b1b1_4b64_8ece_480566fadbca.slice/crio-a4b6373ad6e1f8e1c6b8ed446e8b14a9f8e012feceb40c605c77ec27996df9f2 WatchSource:0}: Error finding container a4b6373ad6e1f8e1c6b8ed446e8b14a9f8e012feceb40c605c77ec27996df9f2: Status 404 returned error can't find the container with id a4b6373ad6e1f8e1c6b8ed446e8b14a9f8e012feceb40c605c77ec27996df9f2
Mar 12 12:33:49.679427 master-0 kubenswrapper[13984]: I0312 12:33:49.679359 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-t5sfc"]
Mar 12 12:33:50.202350 master-0 kubenswrapper[13984]: I0312 12:33:50.202277 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6686f9695f-gkbkr" event={"ID":"de8d0297-e472-45a4-9658-1d893e4c34aa","Type":"ContainerStarted","Data":"57dfe76efbf7deb3a53d129eabb1be70578bcf3fbe93a2dd467072510efe925e"}
Mar 12 12:33:50.204732 master-0 kubenswrapper[13984]: I0312 12:33:50.204682 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-t5sfc" event={"ID":"aa78efd5-b1b1-4b64-8ece-480566fadbca","Type":"ContainerStarted","Data":"a4b6373ad6e1f8e1c6b8ed446e8b14a9f8e012feceb40c605c77ec27996df9f2"}
Mar 12 12:33:50.206439 master-0 kubenswrapper[13984]: I0312 12:33:50.206367 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5" event={"ID":"38ef5c51-5afe-44c6-bfb5-5f80002f72fd","Type":"ContainerStarted","Data":"e4dc08fbd413c69ffd107249b7b552ffb8aa06c61d082607441c1b98ee52263e"}
Mar 12 12:33:51.873978 master-0 kubenswrapper[13984]: I0312 12:33:51.873920 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 12 12:33:51.874810 master-0 kubenswrapper[13984]: I0312 12:33:51.874166 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="343917c5-143b-4807-aac0-c3aef3105186" containerName="installer" containerID="cri-o://7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278" gracePeriod=30
Mar 12 12:33:54.236198 master-0 kubenswrapper[13984]: I0312 12:33:54.236124 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5" event={"ID":"38ef5c51-5afe-44c6-bfb5-5f80002f72fd","Type":"ContainerStarted","Data":"9f19d72d23f69d69a5711b4dab6953c37601272fd41897200840cdb43fd0e580"}
Mar 12 12:33:54.238836 master-0 kubenswrapper[13984]: I0312 12:33:54.238694 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6686f9695f-gkbkr" event={"ID":"de8d0297-e472-45a4-9658-1d893e4c34aa","Type":"ContainerStarted","Data":"bcd65b1e4c21449072ed272f87ee781da0da9fc877eb7847799dae3f064bf2e7"}
Mar 12 12:33:54.272119 master-0 kubenswrapper[13984]: I0312 12:33:54.272036 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5" podStartSLOduration=2.1987363 podStartE2EDuration="6.272016546s" podCreationTimestamp="2026-03-12 12:33:48 +0000 UTC" firstStartedPulling="2026-03-12 12:33:49.324700008 +0000 UTC m=+561.522715500" lastFinishedPulling="2026-03-12 12:33:53.397980214 +0000 UTC m=+565.595995746" observedRunningTime="2026-03-12 12:33:54.268693239 +0000 UTC m=+566.466708751" watchObservedRunningTime="2026-03-12 12:33:54.272016546 +0000 UTC m=+566.470032038"
Mar 12 12:33:54.305029 master-0 kubenswrapper[13984]: I0312 12:33:54.304870 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6686f9695f-gkbkr" podStartSLOduration=2.094006458 podStartE2EDuration="6.304844213s" podCreationTimestamp="2026-03-12 12:33:48 +0000 UTC" firstStartedPulling="2026-03-12 12:33:49.205791763 +0000 UTC m=+561.403807255" lastFinishedPulling="2026-03-12 12:33:53.416629518 +0000 UTC m=+565.614645010" observedRunningTime="2026-03-12 12:33:54.30129781 +0000 UTC m=+566.499313332" watchObservedRunningTime="2026-03-12 12:33:54.304844213 +0000 UTC m=+566.502859745"
Mar 12 12:33:55.248934 master-0 kubenswrapper[13984]: I0312 12:33:55.247781 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:55.252363 master-0 kubenswrapper[13984]: I0312 12:33:55.252327 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-556cc97fc4-pjrv5"
Mar 12 12:33:55.283239 master-0 kubenswrapper[13984]: I0312 12:33:55.282423 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"]
Mar 12 12:33:55.284124 master-0 kubenswrapper[13984]: I0312 12:33:55.284086 13984 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.310101 master-0 kubenswrapper[13984]: I0312 12:33:55.310055 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 12 12:33:55.338507 master-0 kubenswrapper[13984]: I0312 12:33:55.333050 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.338507 master-0 kubenswrapper[13984]: I0312 12:33:55.333324 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-var-lock\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.338507 master-0 kubenswrapper[13984]: I0312 12:33:55.333408 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.435541 master-0 kubenswrapper[13984]: I0312 12:33:55.435470 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-var-lock\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.435541 master-0 kubenswrapper[13984]: I0312 12:33:55.435560 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.435840 master-0 kubenswrapper[13984]: I0312 12:33:55.435816 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-var-lock\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.435993 master-0 kubenswrapper[13984]: I0312 12:33:55.435971 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.436134 master-0 kubenswrapper[13984]: I0312 12:33:55.436116 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.457545 master-0 kubenswrapper[13984]: I0312 12:33:55.453346 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kube-api-access\") pod \"installer-3-master-0\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:55.619545 master-0 kubenswrapper[13984]: I0312 12:33:55.619319 13984 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:33:56.043868 master-0 kubenswrapper[13984]: I0312 12:33:56.043783 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 12 12:33:56.052636 master-0 kubenswrapper[13984]: W0312 12:33:56.052572 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda40de76a_de63_4e37_b1ab_b7fe767e67ea.slice/crio-23301463cb1dfd0157af2fc4f13a78e0792270fbc83c1d54a43c63dea326b9ec WatchSource:0}: Error finding container 23301463cb1dfd0157af2fc4f13a78e0792270fbc83c1d54a43c63dea326b9ec: Status 404 returned error can't find the container with id 23301463cb1dfd0157af2fc4f13a78e0792270fbc83c1d54a43c63dea326b9ec Mar 12 12:33:56.272419 master-0 kubenswrapper[13984]: I0312 12:33:56.272320 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a40de76a-de63-4e37-b1ab-b7fe767e67ea","Type":"ContainerStarted","Data":"23301463cb1dfd0157af2fc4f13a78e0792270fbc83c1d54a43c63dea326b9ec"} Mar 12 12:33:57.282866 master-0 kubenswrapper[13984]: I0312 12:33:57.282807 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a40de76a-de63-4e37-b1ab-b7fe767e67ea","Type":"ContainerStarted","Data":"1025b88b196b472d089ebe4edf860336100d1d80146a29ff9b6c08185c199ccd"} Mar 12 12:33:57.306268 master-0 kubenswrapper[13984]: I0312 12:33:57.306193 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=2.30617544 podStartE2EDuration="2.30617544s" podCreationTimestamp="2026-03-12 12:33:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:33:57.29897002 +0000 UTC m=+569.496985512" 
watchObservedRunningTime="2026-03-12 12:33:57.30617544 +0000 UTC m=+569.504190932" Mar 12 12:33:59.179334 master-0 kubenswrapper[13984]: I0312 12:33:59.178811 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6686f9695f-gkbkr" Mar 12 12:33:59.186006 master-0 kubenswrapper[13984]: I0312 12:33:59.185943 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body= Mar 12 12:33:59.186006 master-0 kubenswrapper[13984]: I0312 12:33:59.186002 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" Mar 12 12:33:59.187372 master-0 kubenswrapper[13984]: I0312 12:33:59.187323 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6686f9695f-gkbkr" Mar 12 12:34:00.465172 master-0 kubenswrapper[13984]: I0312 12:34:00.465106 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:34:00.466026 master-0 kubenswrapper[13984]: I0312 12:34:00.465411 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="alertmanager" containerID="cri-o://7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42" gracePeriod=120 Mar 12 12:34:00.466026 master-0 kubenswrapper[13984]: I0312 12:34:00.465771 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" 
containerName="prom-label-proxy" containerID="cri-o://87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30" gracePeriod=120 Mar 12 12:34:00.466026 master-0 kubenswrapper[13984]: I0312 12:34:00.465831 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-metric" containerID="cri-o://9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1" gracePeriod=120 Mar 12 12:34:00.466026 master-0 kubenswrapper[13984]: I0312 12:34:00.465872 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy" containerID="cri-o://7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e" gracePeriod=120 Mar 12 12:34:00.466026 master-0 kubenswrapper[13984]: I0312 12:34:00.465910 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-web" containerID="cri-o://051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098" gracePeriod=120 Mar 12 12:34:00.466026 master-0 kubenswrapper[13984]: I0312 12:34:00.465940 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="config-reloader" containerID="cri-o://5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73" gracePeriod=120 Mar 12 12:34:01.316585 master-0 kubenswrapper[13984]: I0312 12:34:01.316532 13984 generic.go:334] "Generic (PLEG): container finished" podID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerID="87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30" exitCode=0 Mar 12 12:34:01.316585 master-0 kubenswrapper[13984]: I0312 
12:34:01.316569 13984 generic.go:334] "Generic (PLEG): container finished" podID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerID="7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e" exitCode=0 Mar 12 12:34:01.316585 master-0 kubenswrapper[13984]: I0312 12:34:01.316580 13984 generic.go:334] "Generic (PLEG): container finished" podID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerID="5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73" exitCode=0 Mar 12 12:34:01.316585 master-0 kubenswrapper[13984]: I0312 12:34:01.316590 13984 generic.go:334] "Generic (PLEG): container finished" podID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerID="7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42" exitCode=0 Mar 12 12:34:01.316872 master-0 kubenswrapper[13984]: I0312 12:34:01.316618 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30"} Mar 12 12:34:01.316872 master-0 kubenswrapper[13984]: I0312 12:34:01.316680 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e"} Mar 12 12:34:01.316872 master-0 kubenswrapper[13984]: I0312 12:34:01.316696 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73"} Mar 12 12:34:01.316872 master-0 kubenswrapper[13984]: I0312 12:34:01.316709 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42"} Mar 12 12:34:01.959800 master-0 kubenswrapper[13984]: I0312 12:34:01.958889 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.100241 master-0 kubenswrapper[13984]: I0312 12:34:02.100038 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-main-db\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100241 master-0 kubenswrapper[13984]: I0312 12:34:02.100128 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100241 master-0 kubenswrapper[13984]: I0312 12:34:02.100215 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-main-tls\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100241 master-0 kubenswrapper[13984]: I0312 12:34:02.100256 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-volume\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100318 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100398 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-web\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100422 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-out\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100466 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-metrics-client-ca\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100516 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnnth\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-kube-api-access-tnnth\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100568 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-metric\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100672 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100724 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-tls-assets\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.100785 master-0 kubenswrapper[13984]: I0312 12:34:02.100769 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-web-config\") pod \"41e53fa8-31cb-44a9-9411-8ce2df26b156\" (UID: \"41e53fa8-31cb-44a9-9411-8ce2df26b156\") " Mar 12 12:34:02.101817 master-0 kubenswrapper[13984]: I0312 12:34:02.101775 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:34:02.102408 master-0 kubenswrapper[13984]: I0312 12:34:02.102367 13984 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.102408 master-0 kubenswrapper[13984]: I0312 12:34:02.102399 13984 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.102562 master-0 kubenswrapper[13984]: I0312 12:34:02.102400 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:34:02.104421 master-0 kubenswrapper[13984]: I0312 12:34:02.104385 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-out" (OuterVolumeSpecName: "config-out") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:34:02.104782 master-0 kubenswrapper[13984]: I0312 12:34:02.104703 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). 
InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:02.104782 master-0 kubenswrapper[13984]: I0312 12:34:02.104730 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:02.104975 master-0 kubenswrapper[13984]: I0312 12:34:02.104935 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-volume" (OuterVolumeSpecName: "config-volume") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:02.105158 master-0 kubenswrapper[13984]: I0312 12:34:02.105122 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:02.106468 master-0 kubenswrapper[13984]: I0312 12:34:02.106426 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-kube-api-access-tnnth" (OuterVolumeSpecName: "kube-api-access-tnnth") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). 
InnerVolumeSpecName "kube-api-access-tnnth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:34:02.107024 master-0 kubenswrapper[13984]: I0312 12:34:02.106875 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:34:02.107289 master-0 kubenswrapper[13984]: I0312 12:34:02.107250 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:02.157531 master-0 kubenswrapper[13984]: I0312 12:34:02.157233 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-web-config" (OuterVolumeSpecName: "web-config") pod "41e53fa8-31cb-44a9-9411-8ce2df26b156" (UID: "41e53fa8-31cb-44a9-9411-8ce2df26b156"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204107 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnnth\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-kube-api-access-tnnth\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204178 13984 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204193 13984 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/41e53fa8-31cb-44a9-9411-8ce2df26b156-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204222 13984 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-web-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204234 13984 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204245 13984 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204256 13984 reconciler_common.go:293] 
"Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-volume\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.204222 master-0 kubenswrapper[13984]: I0312 12:34:02.204267 13984 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/41e53fa8-31cb-44a9-9411-8ce2df26b156-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.205335 master-0 kubenswrapper[13984]: I0312 12:34:02.204277 13984 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/41e53fa8-31cb-44a9-9411-8ce2df26b156-config-out\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.205335 master-0 kubenswrapper[13984]: I0312 12:34:02.204308 13984 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41e53fa8-31cb-44a9-9411-8ce2df26b156-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:02.332882 master-0 kubenswrapper[13984]: I0312 12:34:02.329513 13984 generic.go:334] "Generic (PLEG): container finished" podID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerID="9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1" exitCode=0 Mar 12 12:34:02.332882 master-0 kubenswrapper[13984]: I0312 12:34:02.329571 13984 generic.go:334] "Generic (PLEG): container finished" podID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerID="051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098" exitCode=0 Mar 12 12:34:02.332882 master-0 kubenswrapper[13984]: I0312 12:34:02.329604 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1"} Mar 12 12:34:02.332882 master-0 
kubenswrapper[13984]: I0312 12:34:02.329645 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098"} Mar 12 12:34:02.332882 master-0 kubenswrapper[13984]: I0312 12:34:02.329657 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"41e53fa8-31cb-44a9-9411-8ce2df26b156","Type":"ContainerDied","Data":"b17591b3bc628f3876e877072c2ef1407fe17291b50961427b132537d930b0e6"} Mar 12 12:34:02.332882 master-0 kubenswrapper[13984]: I0312 12:34:02.329708 13984 scope.go:117] "RemoveContainer" containerID="87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30" Mar 12 12:34:02.332882 master-0 kubenswrapper[13984]: I0312 12:34:02.329968 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.353215 master-0 kubenswrapper[13984]: I0312 12:34:02.353164 13984 scope.go:117] "RemoveContainer" containerID="9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1" Mar 12 12:34:02.371974 master-0 kubenswrapper[13984]: I0312 12:34:02.371908 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:34:02.379267 master-0 kubenswrapper[13984]: I0312 12:34:02.379192 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:34:02.388250 master-0 kubenswrapper[13984]: I0312 12:34:02.388206 13984 scope.go:117] "RemoveContainer" containerID="7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e" Mar 12 12:34:02.422365 master-0 kubenswrapper[13984]: I0312 12:34:02.422276 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: 
E0312 12:34:02.422627 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: I0312 12:34:02.422643 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: E0312 12:34:02.422667 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="alertmanager" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: I0312 12:34:02.422673 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="alertmanager" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: E0312 12:34:02.422692 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="init-config-reloader" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: I0312 12:34:02.422698 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="init-config-reloader" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: E0312 12:34:02.422709 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="prom-label-proxy" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: I0312 12:34:02.422714 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="prom-label-proxy" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: E0312 12:34:02.422728 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-metric" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: I0312 12:34:02.422734 13984 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-metric" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: E0312 12:34:02.422746 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="config-reloader" Mar 12 12:34:02.422747 master-0 kubenswrapper[13984]: I0312 12:34:02.422752 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="config-reloader" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: E0312 12:34:02.422766 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-web" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: I0312 12:34:02.422772 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-web" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: I0312 12:34:02.422888 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="prom-label-proxy" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: I0312 12:34:02.422904 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="alertmanager" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: I0312 12:34:02.422924 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: I0312 12:34:02.422942 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="config-reloader" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: I0312 12:34:02.422953 13984 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-metric" Mar 12 12:34:02.423219 master-0 kubenswrapper[13984]: I0312 12:34:02.422963 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" containerName="kube-rbac-proxy-web" Mar 12 12:34:02.426400 master-0 kubenswrapper[13984]: I0312 12:34:02.425112 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.430614 master-0 kubenswrapper[13984]: I0312 12:34:02.430553 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 12 12:34:02.430732 master-0 kubenswrapper[13984]: I0312 12:34:02.430682 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 12 12:34:02.432000 master-0 kubenswrapper[13984]: I0312 12:34:02.430557 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 12 12:34:02.432000 master-0 kubenswrapper[13984]: I0312 12:34:02.431130 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 12 12:34:02.432000 master-0 kubenswrapper[13984]: I0312 12:34:02.431327 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 12 12:34:02.432000 master-0 kubenswrapper[13984]: I0312 12:34:02.431645 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 12 12:34:02.432000 master-0 kubenswrapper[13984]: I0312 12:34:02.431818 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 12 12:34:02.432451 master-0 kubenswrapper[13984]: I0312 12:34:02.432415 13984 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-24x8c" Mar 12 12:34:02.441405 master-0 kubenswrapper[13984]: I0312 12:34:02.441358 13984 scope.go:117] "RemoveContainer" containerID="051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098" Mar 12 12:34:02.442078 master-0 kubenswrapper[13984]: I0312 12:34:02.441950 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 12 12:34:02.444346 master-0 kubenswrapper[13984]: I0312 12:34:02.443736 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:34:02.477052 master-0 kubenswrapper[13984]: I0312 12:34:02.476884 13984 scope.go:117] "RemoveContainer" containerID="5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73" Mar 12 12:34:02.493389 master-0 kubenswrapper[13984]: I0312 12:34:02.493340 13984 scope.go:117] "RemoveContainer" containerID="7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42" Mar 12 12:34:02.508575 master-0 kubenswrapper[13984]: I0312 12:34:02.508486 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-config-volume\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508686 master-0 kubenswrapper[13984]: I0312 12:34:02.508658 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/05e8b19e-ab9a-491a-820a-06c23194bc6d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508734 master-0 kubenswrapper[13984]: I0312 12:34:02.508695 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/05e8b19e-ab9a-491a-820a-06c23194bc6d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508776 master-0 kubenswrapper[13984]: I0312 12:34:02.508745 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508807 master-0 kubenswrapper[13984]: I0312 12:34:02.508787 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508840 master-0 kubenswrapper[13984]: I0312 12:34:02.508815 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/05e8b19e-ab9a-491a-820a-06c23194bc6d-config-out\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508877 master-0 kubenswrapper[13984]: I0312 12:34:02.508852 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05e8b19e-ab9a-491a-820a-06c23194bc6d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508911 master-0 kubenswrapper[13984]: I0312 12:34:02.508891 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-web-config\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508945 master-0 kubenswrapper[13984]: I0312 12:34:02.508911 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05e8b19e-ab9a-491a-820a-06c23194bc6d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.508945 master-0 kubenswrapper[13984]: I0312 12:34:02.508935 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l5wb\" (UniqueName: \"kubernetes.io/projected/05e8b19e-ab9a-491a-820a-06c23194bc6d-kube-api-access-2l5wb\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.509003 master-0 kubenswrapper[13984]: I0312 12:34:02.508956 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.509003 master-0 kubenswrapper[13984]: I0312 12:34:02.508986 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.509306 master-0 kubenswrapper[13984]: I0312 12:34:02.509264 13984 scope.go:117] "RemoveContainer" containerID="379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273" Mar 12 12:34:02.525402 master-0 kubenswrapper[13984]: I0312 12:34:02.525362 13984 scope.go:117] "RemoveContainer" containerID="87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30" Mar 12 12:34:02.525816 master-0 kubenswrapper[13984]: E0312 12:34:02.525776 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30\": container with ID starting with 87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30 not found: ID does not exist" containerID="87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30" Mar 12 12:34:02.525875 master-0 kubenswrapper[13984]: I0312 12:34:02.525814 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30"} err="failed to get container status \"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30\": rpc error: code = NotFound desc = could not find container \"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30\": container with ID starting with 87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30 not found: ID does not exist" Mar 12 12:34:02.525875 master-0 kubenswrapper[13984]: I0312 12:34:02.525845 13984 scope.go:117] "RemoveContainer" containerID="9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1" Mar 12 12:34:02.526063 master-0 
kubenswrapper[13984]: E0312 12:34:02.526027 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1\": container with ID starting with 9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1 not found: ID does not exist" containerID="9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1" Mar 12 12:34:02.526063 master-0 kubenswrapper[13984]: I0312 12:34:02.526052 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1"} err="failed to get container status \"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1\": rpc error: code = NotFound desc = could not find container \"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1\": container with ID starting with 9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1 not found: ID does not exist" Mar 12 12:34:02.526134 master-0 kubenswrapper[13984]: I0312 12:34:02.526065 13984 scope.go:117] "RemoveContainer" containerID="7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e" Mar 12 12:34:02.526315 master-0 kubenswrapper[13984]: E0312 12:34:02.526275 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e\": container with ID starting with 7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e not found: ID does not exist" containerID="7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e" Mar 12 12:34:02.526315 master-0 kubenswrapper[13984]: I0312 12:34:02.526306 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e"} err="failed to get container 
status \"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e\": rpc error: code = NotFound desc = could not find container \"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e\": container with ID starting with 7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e not found: ID does not exist" Mar 12 12:34:02.526394 master-0 kubenswrapper[13984]: I0312 12:34:02.526321 13984 scope.go:117] "RemoveContainer" containerID="051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098" Mar 12 12:34:02.527038 master-0 kubenswrapper[13984]: E0312 12:34:02.526995 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098\": container with ID starting with 051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098 not found: ID does not exist" containerID="051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098" Mar 12 12:34:02.527038 master-0 kubenswrapper[13984]: I0312 12:34:02.527025 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098"} err="failed to get container status \"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098\": rpc error: code = NotFound desc = could not find container \"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098\": container with ID starting with 051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098 not found: ID does not exist" Mar 12 12:34:02.527038 master-0 kubenswrapper[13984]: I0312 12:34:02.527039 13984 scope.go:117] "RemoveContainer" containerID="5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73" Mar 12 12:34:02.527368 master-0 kubenswrapper[13984]: E0312 12:34:02.527333 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73\": container with ID starting with 5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73 not found: ID does not exist" containerID="5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73" Mar 12 12:34:02.527368 master-0 kubenswrapper[13984]: I0312 12:34:02.527358 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73"} err="failed to get container status \"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73\": rpc error: code = NotFound desc = could not find container \"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73\": container with ID starting with 5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73 not found: ID does not exist" Mar 12 12:34:02.527435 master-0 kubenswrapper[13984]: I0312 12:34:02.527371 13984 scope.go:117] "RemoveContainer" containerID="7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42" Mar 12 12:34:02.527715 master-0 kubenswrapper[13984]: E0312 12:34:02.527684 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42\": container with ID starting with 7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42 not found: ID does not exist" containerID="7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42" Mar 12 12:34:02.527777 master-0 kubenswrapper[13984]: I0312 12:34:02.527711 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42"} err="failed to get container status \"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42\": rpc error: code = NotFound desc = could not find container 
\"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42\": container with ID starting with 7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42 not found: ID does not exist" Mar 12 12:34:02.527777 master-0 kubenswrapper[13984]: I0312 12:34:02.527727 13984 scope.go:117] "RemoveContainer" containerID="379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273" Mar 12 12:34:02.528065 master-0 kubenswrapper[13984]: E0312 12:34:02.528031 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273\": container with ID starting with 379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273 not found: ID does not exist" containerID="379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273" Mar 12 12:34:02.528065 master-0 kubenswrapper[13984]: I0312 12:34:02.528060 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273"} err="failed to get container status \"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273\": rpc error: code = NotFound desc = could not find container \"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273\": container with ID starting with 379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273 not found: ID does not exist" Mar 12 12:34:02.528150 master-0 kubenswrapper[13984]: I0312 12:34:02.528075 13984 scope.go:117] "RemoveContainer" containerID="87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30" Mar 12 12:34:02.528299 master-0 kubenswrapper[13984]: I0312 12:34:02.528269 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30"} err="failed to get container status 
\"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30\": rpc error: code = NotFound desc = could not find container \"87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30\": container with ID starting with 87f65270ebdd6b1f57b44f85f7de0593222a10ac5406f1010291e19e62234d30 not found: ID does not exist" Mar 12 12:34:02.528299 master-0 kubenswrapper[13984]: I0312 12:34:02.528288 13984 scope.go:117] "RemoveContainer" containerID="9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1" Mar 12 12:34:02.528599 master-0 kubenswrapper[13984]: I0312 12:34:02.528575 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1"} err="failed to get container status \"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1\": rpc error: code = NotFound desc = could not find container \"9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1\": container with ID starting with 9510aeb58389a99f87a7b3c3c3ae044c3793fff3f6f4383e5b72bee67858dbf1 not found: ID does not exist" Mar 12 12:34:02.528599 master-0 kubenswrapper[13984]: I0312 12:34:02.528595 13984 scope.go:117] "RemoveContainer" containerID="7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e" Mar 12 12:34:02.528821 master-0 kubenswrapper[13984]: I0312 12:34:02.528797 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e"} err="failed to get container status \"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e\": rpc error: code = NotFound desc = could not find container \"7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e\": container with ID starting with 7706ffd7f7d5dd453118891c3cd15061715d077e261dff2bb7ef369cc8bf728e not found: ID does not exist" Mar 12 12:34:02.528821 master-0 kubenswrapper[13984]: I0312 
12:34:02.528815 13984 scope.go:117] "RemoveContainer" containerID="051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098" Mar 12 12:34:02.529054 master-0 kubenswrapper[13984]: I0312 12:34:02.529030 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098"} err="failed to get container status \"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098\": rpc error: code = NotFound desc = could not find container \"051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098\": container with ID starting with 051add7f06d2a5a90bb210e60dba86ef5a7cc9c5ddc051eef3ff1e14b8f78098 not found: ID does not exist" Mar 12 12:34:02.529054 master-0 kubenswrapper[13984]: I0312 12:34:02.529048 13984 scope.go:117] "RemoveContainer" containerID="5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73" Mar 12 12:34:02.529453 master-0 kubenswrapper[13984]: I0312 12:34:02.529400 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73"} err="failed to get container status \"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73\": rpc error: code = NotFound desc = could not find container \"5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73\": container with ID starting with 5a8d1408838ee99ee5f4b32f1f0a7119af618f22aed535db5f53ee0291907c73 not found: ID does not exist" Mar 12 12:34:02.529453 master-0 kubenswrapper[13984]: I0312 12:34:02.529423 13984 scope.go:117] "RemoveContainer" containerID="7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42" Mar 12 12:34:02.529888 master-0 kubenswrapper[13984]: I0312 12:34:02.529854 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42"} err="failed to get 
container status \"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42\": rpc error: code = NotFound desc = could not find container \"7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42\": container with ID starting with 7583e7b2197403a4a55a3569fefeb768612960d28526c64da65a6548269ada42 not found: ID does not exist" Mar 12 12:34:02.529888 master-0 kubenswrapper[13984]: I0312 12:34:02.529884 13984 scope.go:117] "RemoveContainer" containerID="379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273" Mar 12 12:34:02.530112 master-0 kubenswrapper[13984]: I0312 12:34:02.530086 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273"} err="failed to get container status \"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273\": rpc error: code = NotFound desc = could not find container \"379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273\": container with ID starting with 379b01f94623f06237d16e2472f9c01b16e102bbf573199d6b8b26bea4c61273 not found: ID does not exist" Mar 12 12:34:02.611122 master-0 kubenswrapper[13984]: I0312 12:34:02.611018 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.611122 master-0 kubenswrapper[13984]: I0312 12:34:02.611126 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.611396 master-0 kubenswrapper[13984]: I0312 12:34:02.611355 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/05e8b19e-ab9a-491a-820a-06c23194bc6d-config-out\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.611596 master-0 kubenswrapper[13984]: I0312 12:34:02.611561 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05e8b19e-ab9a-491a-820a-06c23194bc6d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.611795 master-0 kubenswrapper[13984]: I0312 12:34:02.611769 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-web-config\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.611843 master-0 kubenswrapper[13984]: I0312 12:34:02.611797 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05e8b19e-ab9a-491a-820a-06c23194bc6d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.611843 master-0 kubenswrapper[13984]: I0312 12:34:02.611830 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l5wb\" (UniqueName: \"kubernetes.io/projected/05e8b19e-ab9a-491a-820a-06c23194bc6d-kube-api-access-2l5wb\") pod \"alertmanager-main-0\" (UID: 
\"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.612375 master-0 kubenswrapper[13984]: I0312 12:34:02.612325 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.612478 master-0 kubenswrapper[13984]: I0312 12:34:02.612421 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.612725 master-0 kubenswrapper[13984]: I0312 12:34:02.612682 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-config-volume\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.612786 master-0 kubenswrapper[13984]: I0312 12:34:02.612747 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/05e8b19e-ab9a-491a-820a-06c23194bc6d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.612786 master-0 kubenswrapper[13984]: I0312 12:34:02.612722 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/05e8b19e-ab9a-491a-820a-06c23194bc6d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.612963 master-0 kubenswrapper[13984]: I0312 12:34:02.612936 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/05e8b19e-ab9a-491a-820a-06c23194bc6d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.613071 master-0 kubenswrapper[13984]: I0312 12:34:02.613033 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/05e8b19e-ab9a-491a-820a-06c23194bc6d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.613323 master-0 kubenswrapper[13984]: I0312 12:34:02.613294 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/05e8b19e-ab9a-491a-820a-06c23194bc6d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.615782 master-0 kubenswrapper[13984]: I0312 12:34:02.615727 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/05e8b19e-ab9a-491a-820a-06c23194bc6d-config-out\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.615872 master-0 kubenswrapper[13984]: I0312 12:34:02.615751 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.616291 master-0 kubenswrapper[13984]: I0312 12:34:02.616241 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.616291 master-0 kubenswrapper[13984]: I0312 12:34:02.616254 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-web-config\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.616918 master-0 kubenswrapper[13984]: I0312 12:34:02.616878 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.617419 master-0 kubenswrapper[13984]: I0312 12:34:02.617370 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-config-volume\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.617472 master-0 kubenswrapper[13984]: I0312 12:34:02.617434 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/05e8b19e-ab9a-491a-820a-06c23194bc6d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.625436 master-0 kubenswrapper[13984]: I0312 12:34:02.625345 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/05e8b19e-ab9a-491a-820a-06c23194bc6d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.628722 master-0 kubenswrapper[13984]: I0312 12:34:02.628670 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l5wb\" (UniqueName: \"kubernetes.io/projected/05e8b19e-ab9a-491a-820a-06c23194bc6d-kube-api-access-2l5wb\") pod \"alertmanager-main-0\" (UID: \"05e8b19e-ab9a-491a-820a-06c23194bc6d\") " pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:02.758173 master-0 kubenswrapper[13984]: I0312 12:34:02.758079 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 12 12:34:03.227410 master-0 kubenswrapper[13984]: I0312 12:34:03.227316 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 12 12:34:03.233614 master-0 kubenswrapper[13984]: W0312 12:34:03.233550 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e8b19e_ab9a_491a_820a_06c23194bc6d.slice/crio-854a61ad0e7404a0a7812a449332cc5e224815d4ad19de273a22fb56d3712cdf WatchSource:0}: Error finding container 854a61ad0e7404a0a7812a449332cc5e224815d4ad19de273a22fb56d3712cdf: Status 404 returned error can't find the container with id 854a61ad0e7404a0a7812a449332cc5e224815d4ad19de273a22fb56d3712cdf Mar 12 12:34:03.357808 master-0 kubenswrapper[13984]: I0312 12:34:03.357731 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerStarted","Data":"854a61ad0e7404a0a7812a449332cc5e224815d4ad19de273a22fb56d3712cdf"} Mar 12 12:34:03.992644 master-0 kubenswrapper[13984]: I0312 12:34:03.990782 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41e53fa8-31cb-44a9-9411-8ce2df26b156" path="/var/lib/kubelet/pods/41e53fa8-31cb-44a9-9411-8ce2df26b156/volumes" Mar 12 12:34:04.334542 master-0 kubenswrapper[13984]: I0312 12:34:04.334394 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 12:34:04.335330 master-0 kubenswrapper[13984]: I0312 12:34:04.335284 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="prometheus" containerID="cri-o://b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74" gracePeriod=600 Mar 12 12:34:04.335969 master-0 
kubenswrapper[13984]: I0312 12:34:04.335827 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-thanos" containerID="cri-o://589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd" gracePeriod=600 Mar 12 12:34:04.335969 master-0 kubenswrapper[13984]: I0312 12:34:04.335905 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy" containerID="cri-o://494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b" gracePeriod=600 Mar 12 12:34:04.335969 master-0 kubenswrapper[13984]: I0312 12:34:04.335951 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-web" containerID="cri-o://1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead" gracePeriod=600 Mar 12 12:34:04.338659 master-0 kubenswrapper[13984]: I0312 12:34:04.335992 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="thanos-sidecar" containerID="cri-o://fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22" gracePeriod=600 Mar 12 12:34:04.338659 master-0 kubenswrapper[13984]: I0312 12:34:04.336039 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="config-reloader" containerID="cri-o://099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa" gracePeriod=600 Mar 12 12:34:04.384003 master-0 kubenswrapper[13984]: I0312 12:34:04.383953 13984 generic.go:334] "Generic (PLEG): container finished" 
podID="05e8b19e-ab9a-491a-820a-06c23194bc6d" containerID="a68b94e2a27c3fe9e7c9af8294d9a5c05a6663701714a2a2f7127376d47d8054" exitCode=0 Mar 12 12:34:04.384003 master-0 kubenswrapper[13984]: I0312 12:34:04.383996 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerDied","Data":"a68b94e2a27c3fe9e7c9af8294d9a5c05a6663701714a2a2f7127376d47d8054"} Mar 12 12:34:04.952527 master-0 kubenswrapper[13984]: I0312 12:34:04.952470 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.053108 master-0 kubenswrapper[13984]: I0312 12:34:05.053071 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-metrics-client-ca\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053230 master-0 kubenswrapper[13984]: I0312 12:34:05.053130 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053230 master-0 kubenswrapper[13984]: I0312 12:34:05.053156 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-serving-certs-ca-bundle\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053230 master-0 kubenswrapper[13984]: I0312 12:34:05.053200 13984 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-kube-rbac-proxy\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053382 master-0 kubenswrapper[13984]: I0312 12:34:05.053240 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053382 master-0 kubenswrapper[13984]: I0312 12:34:05.053267 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-grpc-tls\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053382 master-0 kubenswrapper[13984]: I0312 12:34:05.053287 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-kubelet-serving-ca-bundle\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053382 master-0 kubenswrapper[13984]: I0312 12:34:05.053323 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-config\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053382 master-0 kubenswrapper[13984]: I0312 12:34:05.053352 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-metrics-client-certs\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053382 master-0 kubenswrapper[13984]: I0312 12:34:05.053377 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-tls\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053427 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-rulefiles-0\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053451 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053506 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-thanos-prometheus-http-client-file\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053531 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" 
(UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-db\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053571 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-tls-assets\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053598 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95hq7\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-kube-api-access-95hq7\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053615 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-web-config\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.053658 master-0 kubenswrapper[13984]: I0312 12:34:05.053647 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-config-out\") pod \"d30590a1-9d92-4347-84ae-fffc821e6a57\" (UID: \"d30590a1-9d92-4347-84ae-fffc821e6a57\") " Mar 12 12:34:05.055604 master-0 kubenswrapper[13984]: I0312 12:34:05.055569 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: 
"d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:34:05.055693 master-0 kubenswrapper[13984]: I0312 12:34:05.055560 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:34:05.058944 master-0 kubenswrapper[13984]: I0312 12:34:05.058846 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:34:05.059138 master-0 kubenswrapper[13984]: I0312 12:34:05.059066 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:34:05.059675 master-0 kubenswrapper[13984]: I0312 12:34:05.059600 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:34:05.059869 master-0 kubenswrapper[13984]: I0312 12:34:05.059832 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:34:05.060718 master-0 kubenswrapper[13984]: I0312 12:34:05.060679 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.060820 master-0 kubenswrapper[13984]: I0312 12:34:05.060792 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.060913 master-0 kubenswrapper[13984]: I0312 12:34:05.060878 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.061682 master-0 kubenswrapper[13984]: I0312 12:34:05.061650 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.061868 master-0 kubenswrapper[13984]: I0312 12:34:05.061845 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:34:05.062536 master-0 kubenswrapper[13984]: I0312 12:34:05.062505 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-kube-api-access-95hq7" (OuterVolumeSpecName: "kube-api-access-95hq7") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "kube-api-access-95hq7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:34:05.064162 master-0 kubenswrapper[13984]: I0312 12:34:05.064109 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-config" (OuterVolumeSpecName: "config") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.064563 master-0 kubenswrapper[13984]: I0312 12:34:05.064510 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.064721 master-0 kubenswrapper[13984]: I0312 12:34:05.064680 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.064896 master-0 kubenswrapper[13984]: I0312 12:34:05.064831 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.065508 master-0 kubenswrapper[13984]: I0312 12:34:05.065436 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-config-out" (OuterVolumeSpecName: "config-out") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:34:05.135228 master-0 kubenswrapper[13984]: I0312 12:34:05.135189 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-web-config" (OuterVolumeSpecName: "web-config") pod "d30590a1-9d92-4347-84ae-fffc821e6a57" (UID: "d30590a1-9d92-4347-84ae-fffc821e6a57"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:34:05.156077 master-0 kubenswrapper[13984]: I0312 12:34:05.156019 13984 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156077 master-0 kubenswrapper[13984]: I0312 12:34:05.156071 13984 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156085 13984 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156096 13984 
reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156111 13984 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156120 13984 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156129 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156141 13984 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156150 13984 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156159 13984 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156170 13984 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156178 13984 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156187 13984 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156198 13984 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156207 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95hq7\" (UniqueName: \"kubernetes.io/projected/d30590a1-9d92-4347-84ae-fffc821e6a57-kube-api-access-95hq7\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156217 13984 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d30590a1-9d92-4347-84ae-fffc821e6a57-web-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156224 
13984 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d30590a1-9d92-4347-84ae-fffc821e6a57-config-out\") on node \"master-0\" DevicePath \"\""
Mar 12 12:34:05.156269 master-0 kubenswrapper[13984]: I0312 12:34:05.156232 13984 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d30590a1-9d92-4347-84ae-fffc821e6a57-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:34:05.396160 master-0 kubenswrapper[13984]: I0312 12:34:05.396098 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerStarted","Data":"0d1cbd789b570982a7e5866c2abd563b966311827c4efa8a495298709baf0e81"}
Mar 12 12:34:05.396639 master-0 kubenswrapper[13984]: I0312 12:34:05.396180 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerStarted","Data":"20a4135288568cb737aece0162777de177679c980ea97230b448cbe3ae8f69c7"}
Mar 12 12:34:05.396639 master-0 kubenswrapper[13984]: I0312 12:34:05.396200 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerStarted","Data":"3dba9b997339fb6315e8b1f3dab24490c8f9981444f250cd4f700d8f97f08993"}
Mar 12 12:34:05.396639 master-0 kubenswrapper[13984]: I0312 12:34:05.396216 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerStarted","Data":"3212f50ae272161a35cdc9b622d3be00ac7e3bce07a20e662429d1f1e310e87b"}
Mar 12 12:34:05.396639 master-0 kubenswrapper[13984]: I0312 12:34:05.396231 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerStarted","Data":"f725e403ae6c5a2033e56fe382458a201b081650969fabf3488fa8ca897f9c85"}
Mar 12 12:34:05.396639 master-0 kubenswrapper[13984]: I0312 12:34:05.396245 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"05e8b19e-ab9a-491a-820a-06c23194bc6d","Type":"ContainerStarted","Data":"06b3b5b5bc0e59a3081cf02a38d63466fcda619bf42ff0ab12839501a804dd43"}
Mar 12 12:34:05.404606 master-0 kubenswrapper[13984]: I0312 12:34:05.404547 13984 generic.go:334] "Generic (PLEG): container finished" podID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd" exitCode=0
Mar 12 12:34:05.404606 master-0 kubenswrapper[13984]: I0312 12:34:05.404598 13984 generic.go:334] "Generic (PLEG): container finished" podID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b" exitCode=0
Mar 12 12:34:05.404723 master-0 kubenswrapper[13984]: I0312 12:34:05.404614 13984 generic.go:334] "Generic (PLEG): container finished" podID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead" exitCode=0
Mar 12 12:34:05.404723 master-0 kubenswrapper[13984]: I0312 12:34:05.404626 13984 generic.go:334] "Generic (PLEG): container finished" podID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22" exitCode=0
Mar 12 12:34:05.404723 master-0 kubenswrapper[13984]: I0312 12:34:05.404635 13984 generic.go:334] "Generic (PLEG): container finished" podID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa" exitCode=0
Mar 12 12:34:05.404723 master-0 kubenswrapper[13984]: I0312 12:34:05.404645 13984 generic.go:334] "Generic (PLEG): container finished" podID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74" exitCode=0
Mar 12 12:34:05.404723 master-0 kubenswrapper[13984]: I0312 12:34:05.404665 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:34:05.405458 master-0 kubenswrapper[13984]: I0312 12:34:05.404674 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"}
Mar 12 12:34:05.405530 master-0 kubenswrapper[13984]: I0312 12:34:05.405501 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"}
Mar 12 12:34:05.405577 master-0 kubenswrapper[13984]: I0312 12:34:05.405531 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"}
Mar 12 12:34:05.405577 master-0 kubenswrapper[13984]: I0312 12:34:05.405544 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"}
Mar 12 12:34:05.405577 master-0 kubenswrapper[13984]: I0312 12:34:05.405557 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"}
Mar 12 12:34:05.405689 master-0 kubenswrapper[13984]: I0312 12:34:05.405588 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"}
Mar 12 12:34:05.405689 master-0 kubenswrapper[13984]: I0312 12:34:05.405602 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"d30590a1-9d92-4347-84ae-fffc821e6a57","Type":"ContainerDied","Data":"06ee77275245407cd8c409a3f8c9468f1a5cbeabd28a8c022ac8ea47bcc72739"}
Mar 12 12:34:05.405689 master-0 kubenswrapper[13984]: I0312 12:34:05.405625 13984 scope.go:117] "RemoveContainer" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"
Mar 12 12:34:05.433086 master-0 kubenswrapper[13984]: I0312 12:34:05.433034 13984 scope.go:117] "RemoveContainer" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"
Mar 12 12:34:05.441222 master-0 kubenswrapper[13984]: I0312 12:34:05.437139 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=3.437114827 podStartE2EDuration="3.437114827s" podCreationTimestamp="2026-03-12 12:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:34:05.431891545 +0000 UTC m=+577.629907047" watchObservedRunningTime="2026-03-12 12:34:05.437114827 +0000 UTC m=+577.635130319"
Mar 12 12:34:05.471693 master-0 kubenswrapper[13984]: I0312 12:34:05.471636 13984 scope.go:117] "RemoveContainer" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"
Mar 12 12:34:05.477526 master-0 kubenswrapper[13984]: I0312 12:34:05.477457 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 12:34:05.480121 master-0 kubenswrapper[13984]: I0312 12:34:05.479717 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 12:34:05.499338 master-0 kubenswrapper[13984]: I0312 12:34:05.498214 13984 scope.go:117] "RemoveContainer" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"
Mar 12 12:34:05.527115 master-0 kubenswrapper[13984]: I0312 12:34:05.527012 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: E0312 12:34:05.527614 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.527662 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: E0312 12:34:05.527688 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-web"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.527697 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-web"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: E0312 12:34:05.527718 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-thanos"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.527727 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-thanos"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: E0312 12:34:05.527779 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="init-config-reloader"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.527789 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="init-config-reloader"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: E0312 12:34:05.527800 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="thanos-sidecar"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.527808 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="thanos-sidecar"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: E0312 12:34:05.527825 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="prometheus"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.527833 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="prometheus"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: E0312 12:34:05.527853 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="config-reloader"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.527860 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="config-reloader"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.528032 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="config-reloader"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.528055 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-web"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.528067 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="thanos-sidecar"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.528081 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="prometheus"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.528093 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy"
Mar 12 12:34:05.528558 master-0 kubenswrapper[13984]: I0312 12:34:05.528153 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" containerName="kube-rbac-proxy-thanos"
Mar 12 12:34:05.533341 master-0 kubenswrapper[13984]: I0312 12:34:05.533301 13984 scope.go:117] "RemoveContainer" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"
Mar 12 12:34:05.547723 master-0 kubenswrapper[13984]: I0312 12:34:05.544480 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 12 12:34:05.547723 master-0 kubenswrapper[13984]: I0312 12:34:05.547309 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-29i444egqnkse"
Mar 12 12:34:05.547723 master-0 kubenswrapper[13984]: I0312 12:34:05.547565 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 12 12:34:05.549963 master-0 kubenswrapper[13984]: I0312 12:34:05.549938 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-7vmf8"
Mar 12 12:34:05.550121 master-0 kubenswrapper[13984]: I0312 12:34:05.550083 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 12 12:34:05.550517 master-0 kubenswrapper[13984]: I0312 12:34:05.550458 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 12 12:34:05.550634 master-0 kubenswrapper[13984]: I0312 12:34:05.550567 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 12 12:34:05.550634 master-0 kubenswrapper[13984]: I0312 12:34:05.550593 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 12 12:34:05.550757 master-0 kubenswrapper[13984]: I0312 12:34:05.550705 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 12 12:34:05.551062 master-0 kubenswrapper[13984]: I0312 12:34:05.551022 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 12 12:34:05.551440 master-0 kubenswrapper[13984]: I0312 12:34:05.551398 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 12 12:34:05.552860 master-0 kubenswrapper[13984]: I0312 12:34:05.552818 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 12 12:34:05.560693 master-0 kubenswrapper[13984]: I0312 12:34:05.560649 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 12 12:34:05.561722 master-0 kubenswrapper[13984]: I0312 12:34:05.561658 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 12 12:34:05.563900 master-0 kubenswrapper[13984]: I0312 12:34:05.563854 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 12 12:34:05.568583 master-0 kubenswrapper[13984]: I0312 12:34:05.568545 13984 scope.go:117] "RemoveContainer" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"
Mar 12 12:34:05.587799 master-0 kubenswrapper[13984]: I0312 12:34:05.587726 13984 scope.go:117] "RemoveContainer" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"
Mar 12 12:34:05.603349 master-0 kubenswrapper[13984]: I0312 12:34:05.603182 13984 scope.go:117] "RemoveContainer" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"
Mar 12 12:34:05.603687 master-0 kubenswrapper[13984]: E0312 12:34:05.603643 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": container with ID starting with 589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd not found: ID does not exist" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"
Mar 12 12:34:05.603761 master-0 kubenswrapper[13984]: I0312 12:34:05.603686 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"} err="failed to get container status \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": rpc error: code = NotFound desc = could not find container \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": container with ID starting with 589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd not found: ID does not exist"
Mar 12 12:34:05.603761 master-0 kubenswrapper[13984]: I0312 12:34:05.603711 13984 scope.go:117] "RemoveContainer" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"
Mar 12 12:34:05.604175 master-0 kubenswrapper[13984]: E0312 12:34:05.604135 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": container with ID starting with 494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b not found: ID does not exist" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"
Mar 12 12:34:05.604237 master-0 kubenswrapper[13984]: I0312 12:34:05.604166 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"} err="failed to get container status \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": rpc error: code = NotFound desc = could not find container \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": container with ID starting with 494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b not found: ID does not exist"
Mar 12 12:34:05.604292 master-0 kubenswrapper[13984]: I0312 12:34:05.604200 13984 scope.go:117] "RemoveContainer" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"
Mar 12 12:34:05.604590 master-0 kubenswrapper[13984]: E0312 12:34:05.604548 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": container with ID starting with 1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead not found: ID does not exist" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"
Mar 12 12:34:05.604590 master-0 kubenswrapper[13984]: I0312 12:34:05.604583 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"} err="failed to get container status \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": rpc error: code = NotFound desc = could not find container \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": container with ID starting with 1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead not found: ID does not exist"
Mar 12 12:34:05.604700 master-0 kubenswrapper[13984]: I0312 12:34:05.604599 13984 scope.go:117] "RemoveContainer" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"
Mar 12 12:34:05.604935 master-0 kubenswrapper[13984]: E0312 12:34:05.604895 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": container with ID starting with fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22 not found: ID does not exist" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"
Mar 12 12:34:05.604987 master-0 kubenswrapper[13984]: I0312 12:34:05.604928 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"} err="failed to get container status \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": rpc error: code = NotFound desc = could not find container \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": container with ID starting with fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22 not found: ID does not exist"
Mar 12 12:34:05.604987 master-0 kubenswrapper[13984]: I0312 12:34:05.604946 13984 scope.go:117] "RemoveContainer" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"
Mar 12 12:34:05.605367 master-0 kubenswrapper[13984]: E0312 12:34:05.605326 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": container with ID starting with 099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa not found: ID does not exist" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"
Mar 12 12:34:05.605421 master-0 kubenswrapper[13984]: I0312 12:34:05.605360 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"} err="failed to get container status \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": rpc error: code = NotFound desc = could not find container \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": container with ID starting with 099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa not found: ID does not exist"
Mar 12 12:34:05.605421 master-0 kubenswrapper[13984]: I0312 12:34:05.605380 13984 scope.go:117] "RemoveContainer" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"
Mar 12 12:34:05.605918 master-0 kubenswrapper[13984]: E0312 12:34:05.605799 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": container with ID starting with b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74 not found: ID does not exist" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"
Mar 12 12:34:05.605985 master-0 kubenswrapper[13984]: I0312 12:34:05.605963 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"} err="failed to get container status \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": rpc error: code = NotFound desc = could not find container \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": container with ID starting with b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74 not found: ID does not exist"
Mar 12 12:34:05.606026 master-0 kubenswrapper[13984]: I0312 12:34:05.605995 13984 scope.go:117] "RemoveContainer" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"
Mar 12 12:34:05.606320 master-0 kubenswrapper[13984]: E0312 12:34:05.606281 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": container with ID starting with bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e not found: ID does not exist" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"
Mar 12 12:34:05.606320 master-0 kubenswrapper[13984]: I0312 12:34:05.606311 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"} err="failed to get container status \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": rpc error: code = NotFound desc = could not find container \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": container with ID starting with bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e not found: ID does not exist"
Mar 12 12:34:05.606400 master-0 kubenswrapper[13984]: I0312 12:34:05.606329 13984 scope.go:117] "RemoveContainer" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"
Mar 12 12:34:05.606615 master-0 kubenswrapper[13984]: I0312 12:34:05.606580 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"} err="failed to get container status \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": rpc error: code = NotFound desc = could not find container \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": container with ID starting with 589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd not found: ID does not exist"
Mar 12 12:34:05.606615 master-0 kubenswrapper[13984]: I0312 12:34:05.606611 13984 scope.go:117] "RemoveContainer" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"
Mar 12 12:34:05.606922 master-0 kubenswrapper[13984]: I0312 12:34:05.606886 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"} err="failed to get container status \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": rpc error: code = NotFound desc = could not find container \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": container with ID starting with 494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b not found: ID does not exist"
Mar 12 12:34:05.606922 master-0 kubenswrapper[13984]: I0312 12:34:05.606912 13984 scope.go:117] "RemoveContainer" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"
Mar 12 12:34:05.607326 master-0 kubenswrapper[13984]: I0312 12:34:05.607280 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"} err="failed to get container status \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": rpc error: code = NotFound desc = could not find container \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": container with ID starting with 1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead not found: ID does not exist"
Mar 12 12:34:05.607326 master-0 kubenswrapper[13984]: I0312 12:34:05.607314 13984 scope.go:117] "RemoveContainer" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"
Mar 12 12:34:05.607691 master-0 kubenswrapper[13984]: I0312 12:34:05.607633 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"} err="failed to get container status \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": rpc error: code = NotFound desc = could not find container \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": container with ID starting with fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22 not found: ID does not exist"
Mar 12 12:34:05.607691 master-0 kubenswrapper[13984]: I0312 12:34:05.607687 13984 scope.go:117] "RemoveContainer" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"
Mar 12 12:34:05.608376 master-0 kubenswrapper[13984]: I0312 12:34:05.607990 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"} err="failed to get container status \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": rpc error: code = NotFound desc = could not find container \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": container with ID starting with 099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa not found: ID does not exist"
Mar 12 12:34:05.608376 master-0 kubenswrapper[13984]: I0312 12:34:05.608026 13984 scope.go:117] "RemoveContainer" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"
Mar 12 12:34:05.608376 master-0 kubenswrapper[13984]: I0312 12:34:05.608310 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"} err="failed to get container status \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": rpc error: code = NotFound desc = could not find container \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": container with ID starting with b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74 not found: ID does not exist"
Mar 12 12:34:05.608376 master-0 kubenswrapper[13984]: I0312 12:34:05.608332 13984 scope.go:117] "RemoveContainer" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"
Mar 12 12:34:05.608699 master-0 kubenswrapper[13984]: I0312 12:34:05.608670 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"} err="failed to get container status \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": rpc error: code = NotFound desc = could not find container \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": container with ID starting with bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e not found: ID does not exist"
Mar 12 12:34:05.608699 master-0 kubenswrapper[13984]: I0312 12:34:05.608690 13984 scope.go:117] "RemoveContainer" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"
Mar 12 12:34:05.609012 master-0 kubenswrapper[13984]: I0312 12:34:05.608973 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"} err="failed to get container status \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": rpc error: code = NotFound desc = could not find container \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": container with ID starting with 589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd not found: ID does not exist"
Mar 12 12:34:05.609012 master-0 kubenswrapper[13984]: I0312 12:34:05.609000 13984 scope.go:117] "RemoveContainer" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"
Mar 12 12:34:05.609240 master-0 kubenswrapper[13984]: I0312 12:34:05.609200 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"} err="failed to get container status \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": rpc error: code = NotFound desc = could not find container \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": container with ID starting with 494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b not found: ID does not exist"
Mar 12 12:34:05.609240 master-0 kubenswrapper[13984]: I0312 12:34:05.609225 13984 scope.go:117] "RemoveContainer" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"
Mar 12 12:34:05.609519 master-0 kubenswrapper[13984]: I0312 12:34:05.609434 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"} err="failed to get container status \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": rpc error: code = NotFound desc = could not find container \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": container with ID starting with 1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead not found: ID does not exist"
Mar 12 12:34:05.609636 master-0 kubenswrapper[13984]: I0312 12:34:05.609608 13984 scope.go:117] "RemoveContainer" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"
Mar 12 12:34:05.609945 master-0 kubenswrapper[13984]: I0312 12:34:05.609914 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"} err="failed to get container status \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": rpc error: code = NotFound desc = could not find container \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": container with ID starting with fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22 not found: ID does not exist"
Mar 12 12:34:05.609945 master-0 kubenswrapper[13984]: I0312 12:34:05.609943 13984 scope.go:117] "RemoveContainer" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"
Mar 12 12:34:05.610272 master-0 kubenswrapper[13984]: I0312 12:34:05.610233 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"} err="failed to get container status \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": rpc error: code = NotFound desc = could not find container \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": container with ID starting with 099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa not found: ID does not exist"
Mar 12 12:34:05.610272 master-0 kubenswrapper[13984]: I0312 12:34:05.610260 13984 scope.go:117] "RemoveContainer" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"
Mar 12 12:34:05.610552 master-0 kubenswrapper[13984]: I0312 12:34:05.610521 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"} err="failed to get container status \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": rpc error: code = NotFound desc = could not find container \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": container with ID starting with b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74 not found: ID does not exist"
Mar 12 12:34:05.610603 master-0 kubenswrapper[13984]: I0312 12:34:05.610550 13984 scope.go:117] "RemoveContainer" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"
Mar 12 12:34:05.610927 master-0 kubenswrapper[13984]: I0312 12:34:05.610886 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"} err="failed to get container status \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": rpc error: code = NotFound desc = could not find container \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": container with ID starting with bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e not found: ID does not exist"
Mar 12 12:34:05.610927 master-0 kubenswrapper[13984]: I0312 12:34:05.610916 13984 scope.go:117] "RemoveContainer" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"
Mar 12 12:34:05.611257 master-0 kubenswrapper[13984]: I0312 12:34:05.611209 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"} err="failed to get container status \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": rpc error: code = NotFound desc = could not find container \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": container with ID starting with 589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd not found: ID does not exist"
Mar 12 12:34:05.611257 master-0 kubenswrapper[13984]: I0312 12:34:05.611245 13984 scope.go:117] "RemoveContainer" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"
Mar 12 12:34:05.611530 master-0 kubenswrapper[13984]: I0312 12:34:05.611489 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"} err="failed to get container status \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": rpc error: code = NotFound desc = could not find container \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": container with ID starting with 494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b not found: ID does not exist"
Mar 12 12:34:05.611530 master-0 kubenswrapper[13984]: I0312 12:34:05.611519 13984 scope.go:117] "RemoveContainer" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"
Mar 12 12:34:05.611824 master-0 kubenswrapper[13984]: I0312 12:34:05.611782 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"} err="failed to get container status \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": rpc error: code = NotFound desc = could not find container \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": container with ID starting with 1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead not found: ID does not exist"
Mar 12 12:34:05.611824 master-0 kubenswrapper[13984]: I0312 12:34:05.611812 13984 scope.go:117] "RemoveContainer" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"
Mar 12 12:34:05.612199 master-0 kubenswrapper[13984]: I0312 12:34:05.612160 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"} err="failed to get container status \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": rpc error: code = NotFound desc = could not find container \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": container with ID starting with fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22 not found: ID does not exist"
Mar 12 12:34:05.612199 master-0 kubenswrapper[13984]: I0312 12:34:05.612187 13984 scope.go:117] "RemoveContainer" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"
Mar 12 12:34:05.612524 master-0 kubenswrapper[13984]: I0312 12:34:05.612464 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"} err="failed to get container status \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": rpc error: code = NotFound desc = could not find container \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": container with ID starting with 099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa not found: ID does not exist"
Mar 12 12:34:05.612524 master-0 kubenswrapper[13984]: I0312 12:34:05.612516 13984 scope.go:117] "RemoveContainer" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"
Mar 12 12:34:05.612854 master-0 kubenswrapper[13984]: I0312 12:34:05.612813 13984 pod_container_deletor.go:53]
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"} err="failed to get container status \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": rpc error: code = NotFound desc = could not find container \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": container with ID starting with b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74 not found: ID does not exist" Mar 12 12:34:05.612854 master-0 kubenswrapper[13984]: I0312 12:34:05.612841 13984 scope.go:117] "RemoveContainer" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e" Mar 12 12:34:05.613155 master-0 kubenswrapper[13984]: I0312 12:34:05.613114 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"} err="failed to get container status \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": rpc error: code = NotFound desc = could not find container \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": container with ID starting with bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e not found: ID does not exist" Mar 12 12:34:05.613155 master-0 kubenswrapper[13984]: I0312 12:34:05.613144 13984 scope.go:117] "RemoveContainer" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd" Mar 12 12:34:05.613430 master-0 kubenswrapper[13984]: I0312 12:34:05.613388 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"} err="failed to get container status \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": rpc error: code = NotFound desc = could not find container \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": container with ID starting 
with 589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd not found: ID does not exist" Mar 12 12:34:05.613430 master-0 kubenswrapper[13984]: I0312 12:34:05.613420 13984 scope.go:117] "RemoveContainer" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b" Mar 12 12:34:05.613727 master-0 kubenswrapper[13984]: I0312 12:34:05.613672 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"} err="failed to get container status \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": rpc error: code = NotFound desc = could not find container \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": container with ID starting with 494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b not found: ID does not exist" Mar 12 12:34:05.613727 master-0 kubenswrapper[13984]: I0312 12:34:05.613718 13984 scope.go:117] "RemoveContainer" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead" Mar 12 12:34:05.613988 master-0 kubenswrapper[13984]: I0312 12:34:05.613960 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"} err="failed to get container status \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": rpc error: code = NotFound desc = could not find container \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": container with ID starting with 1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead not found: ID does not exist" Mar 12 12:34:05.613988 master-0 kubenswrapper[13984]: I0312 12:34:05.613987 13984 scope.go:117] "RemoveContainer" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22" Mar 12 12:34:05.614382 master-0 kubenswrapper[13984]: I0312 12:34:05.614340 13984 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"} err="failed to get container status \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": rpc error: code = NotFound desc = could not find container \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": container with ID starting with fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22 not found: ID does not exist" Mar 12 12:34:05.614382 master-0 kubenswrapper[13984]: I0312 12:34:05.614369 13984 scope.go:117] "RemoveContainer" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa" Mar 12 12:34:05.614812 master-0 kubenswrapper[13984]: I0312 12:34:05.614772 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"} err="failed to get container status \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": rpc error: code = NotFound desc = could not find container \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": container with ID starting with 099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa not found: ID does not exist" Mar 12 12:34:05.614812 master-0 kubenswrapper[13984]: I0312 12:34:05.614801 13984 scope.go:117] "RemoveContainer" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74" Mar 12 12:34:05.615063 master-0 kubenswrapper[13984]: I0312 12:34:05.615023 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"} err="failed to get container status \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": rpc error: code = NotFound desc = could not find container \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": 
container with ID starting with b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74 not found: ID does not exist" Mar 12 12:34:05.615063 master-0 kubenswrapper[13984]: I0312 12:34:05.615048 13984 scope.go:117] "RemoveContainer" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e" Mar 12 12:34:05.615280 master-0 kubenswrapper[13984]: I0312 12:34:05.615241 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"} err="failed to get container status \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": rpc error: code = NotFound desc = could not find container \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": container with ID starting with bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e not found: ID does not exist" Mar 12 12:34:05.615280 master-0 kubenswrapper[13984]: I0312 12:34:05.615273 13984 scope.go:117] "RemoveContainer" containerID="589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd" Mar 12 12:34:05.615848 master-0 kubenswrapper[13984]: I0312 12:34:05.615808 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd"} err="failed to get container status \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": rpc error: code = NotFound desc = could not find container \"589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd\": container with ID starting with 589a4cbf1f7e9ff294c94009e75262cf79157cbcaf650dc70ad892740a56fdcd not found: ID does not exist" Mar 12 12:34:05.615848 master-0 kubenswrapper[13984]: I0312 12:34:05.615837 13984 scope.go:117] "RemoveContainer" containerID="494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b" Mar 12 12:34:05.616087 master-0 kubenswrapper[13984]: I0312 12:34:05.616050 
13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b"} err="failed to get container status \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": rpc error: code = NotFound desc = could not find container \"494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b\": container with ID starting with 494124dae938dc689cabb1cfb572f0713e8aa91ed501a44a73ba503d9fe0580b not found: ID does not exist" Mar 12 12:34:05.616087 master-0 kubenswrapper[13984]: I0312 12:34:05.616074 13984 scope.go:117] "RemoveContainer" containerID="1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead" Mar 12 12:34:05.616548 master-0 kubenswrapper[13984]: I0312 12:34:05.616469 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead"} err="failed to get container status \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": rpc error: code = NotFound desc = could not find container \"1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead\": container with ID starting with 1627bf68545575566b71a5c8d1d54033012beb0a1cb81d99c14390769026bead not found: ID does not exist" Mar 12 12:34:05.616548 master-0 kubenswrapper[13984]: I0312 12:34:05.616546 13984 scope.go:117] "RemoveContainer" containerID="fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22" Mar 12 12:34:05.616926 master-0 kubenswrapper[13984]: I0312 12:34:05.616865 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22"} err="failed to get container status \"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": rpc error: code = NotFound desc = could not find container 
\"fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22\": container with ID starting with fad7bec3b07db5273c7a54797cf76bb9cb2b09e888b7dc3a7f0b3f2870873b22 not found: ID does not exist" Mar 12 12:34:05.616926 master-0 kubenswrapper[13984]: I0312 12:34:05.616896 13984 scope.go:117] "RemoveContainer" containerID="099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa" Mar 12 12:34:05.617106 master-0 kubenswrapper[13984]: I0312 12:34:05.617075 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa"} err="failed to get container status \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": rpc error: code = NotFound desc = could not find container \"099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa\": container with ID starting with 099762e92379ebf733f13e8bfea53f751671f3dd4ea8c9210902c54bb238efaa not found: ID does not exist" Mar 12 12:34:05.617106 master-0 kubenswrapper[13984]: I0312 12:34:05.617097 13984 scope.go:117] "RemoveContainer" containerID="b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74" Mar 12 12:34:05.617316 master-0 kubenswrapper[13984]: I0312 12:34:05.617274 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74"} err="failed to get container status \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": rpc error: code = NotFound desc = could not find container \"b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74\": container with ID starting with b16de4cf6dd3607b0c077ec58c8afb319f84db73ccc8de8f83af8813b998bd74 not found: ID does not exist" Mar 12 12:34:05.617316 master-0 kubenswrapper[13984]: I0312 12:34:05.617305 13984 scope.go:117] "RemoveContainer" containerID="bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e" Mar 12 
12:34:05.617773 master-0 kubenswrapper[13984]: I0312 12:34:05.617712 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e"} err="failed to get container status \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": rpc error: code = NotFound desc = could not find container \"bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e\": container with ID starting with bdbfdd7921c2dd69ce493518e23eaaff21db6a60d50077f894b40d2f1bff693e not found: ID does not exist" Mar 12 12:34:05.666361 master-0 kubenswrapper[13984]: I0312 12:34:05.666160 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666361 master-0 kubenswrapper[13984]: I0312 12:34:05.666284 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666604 master-0 kubenswrapper[13984]: I0312 12:34:05.666441 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkgjt\" (UniqueName: \"kubernetes.io/projected/e5e27787-de34-48dd-8854-79387c59fa6c-kube-api-access-wkgjt\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666604 master-0 kubenswrapper[13984]: I0312 12:34:05.666535 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5e27787-de34-48dd-8854-79387c59fa6c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666604 master-0 kubenswrapper[13984]: I0312 12:34:05.666579 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666692 master-0 kubenswrapper[13984]: I0312 12:34:05.666609 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666742 master-0 kubenswrapper[13984]: I0312 12:34:05.666718 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666787 master-0 kubenswrapper[13984]: I0312 12:34:05.666759 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-config\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666883 master-0 kubenswrapper[13984]: I0312 12:34:05.666802 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666943 master-0 kubenswrapper[13984]: I0312 12:34:05.666889 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.666943 master-0 kubenswrapper[13984]: I0312 12:34:05.666928 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.667011 master-0 kubenswrapper[13984]: I0312 12:34:05.666960 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.667011 master-0 kubenswrapper[13984]: I0312 12:34:05.666997 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.667068 master-0 kubenswrapper[13984]: I0312 12:34:05.667033 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.667157 master-0 kubenswrapper[13984]: I0312 12:34:05.667129 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.667191 master-0 kubenswrapper[13984]: I0312 12:34:05.667163 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5e27787-de34-48dd-8854-79387c59fa6c-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.667225 master-0 kubenswrapper[13984]: I0312 12:34:05.667188 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.667278 master-0 kubenswrapper[13984]: I0312 12:34:05.667257 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.768807 master-0 kubenswrapper[13984]: I0312 12:34:05.768743 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.768807 master-0 kubenswrapper[13984]: I0312 12:34:05.768800 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.768838 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkgjt\" (UniqueName: \"kubernetes.io/projected/e5e27787-de34-48dd-8854-79387c59fa6c-kube-api-access-wkgjt\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.768862 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5e27787-de34-48dd-8854-79387c59fa6c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 
12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.768886 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.768908 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.768951 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.768980 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-config\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.769002 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769053 master-0 kubenswrapper[13984]: I0312 12:34:05.769036 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769271 master-0 kubenswrapper[13984]: I0312 12:34:05.769063 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769271 master-0 kubenswrapper[13984]: I0312 12:34:05.769089 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769271 master-0 kubenswrapper[13984]: I0312 12:34:05.769115 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769271 master-0 kubenswrapper[13984]: I0312 12:34:05.769137 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769271 master-0 kubenswrapper[13984]: I0312 12:34:05.769195 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769271 master-0 kubenswrapper[13984]: I0312 12:34:05.769219 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5e27787-de34-48dd-8854-79387c59fa6c-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769271 master-0 kubenswrapper[13984]: I0312 12:34:05.769245 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.769605 master-0 kubenswrapper[13984]: I0312 12:34:05.769285 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.770274 master-0 kubenswrapper[13984]: I0312 12:34:05.770238 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.771174 master-0 kubenswrapper[13984]: I0312 12:34:05.771126 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.774134 master-0 kubenswrapper[13984]: I0312 12:34:05.774096 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.774645 master-0 kubenswrapper[13984]: I0312 12:34:05.774619 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.774967 master-0 kubenswrapper[13984]: I0312 12:34:05.774943 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.775410 master-0 kubenswrapper[13984]: I0312 12:34:05.775386 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.775458 master-0 kubenswrapper[13984]: I0312 12:34:05.775447 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.775615 master-0 kubenswrapper[13984]: I0312 12:34:05.775582 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.775800 master-0 kubenswrapper[13984]: I0312 12:34:05.775772 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.776055 master-0 kubenswrapper[13984]: I0312 12:34:05.776024 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.777365 master-0 kubenswrapper[13984]: I0312 12:34:05.777331 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e5e27787-de34-48dd-8854-79387c59fa6c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.777880 master-0 kubenswrapper[13984]: I0312 12:34:05.777840 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.778190 master-0 kubenswrapper[13984]: I0312 12:34:05.778157 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-config\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.780876 master-0 kubenswrapper[13984]: I0312 12:34:05.780841 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-web-config\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.781073 master-0 kubenswrapper[13984]: I0312 12:34:05.781045 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.785074 master-0 kubenswrapper[13984]: I0312 12:34:05.783943 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e5e27787-de34-48dd-8854-79387c59fa6c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.786386 master-0 kubenswrapper[13984]: I0312 12:34:05.786336 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e5e27787-de34-48dd-8854-79387c59fa6c-config-out\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.786509 master-0 kubenswrapper[13984]: I0312 12:34:05.786468 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkgjt\" (UniqueName: \"kubernetes.io/projected/e5e27787-de34-48dd-8854-79387c59fa6c-kube-api-access-wkgjt\") pod \"prometheus-k8s-0\" (UID: \"e5e27787-de34-48dd-8854-79387c59fa6c\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.870765 master-0 kubenswrapper[13984]: I0312 12:34:05.870692 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:05.994037 master-0 kubenswrapper[13984]: I0312 12:34:05.993977 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30590a1-9d92-4347-84ae-fffc821e6a57" path="/var/lib/kubelet/pods/d30590a1-9d92-4347-84ae-fffc821e6a57/volumes" Mar 12 12:34:06.309882 master-0 kubenswrapper[13984]: I0312 12:34:06.309625 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 12 12:34:06.316146 master-0 kubenswrapper[13984]: W0312 12:34:06.316107 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5e27787_de34_48dd_8854_79387c59fa6c.slice/crio-3787318f432a229fedd404a9636484348c394f285858bd01130479692439ee04 WatchSource:0}: Error finding container 3787318f432a229fedd404a9636484348c394f285858bd01130479692439ee04: Status 404 returned error can't find the container with id 3787318f432a229fedd404a9636484348c394f285858bd01130479692439ee04 Mar 12 12:34:06.414417 master-0 kubenswrapper[13984]: I0312 12:34:06.414359 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerStarted","Data":"3787318f432a229fedd404a9636484348c394f285858bd01130479692439ee04"} Mar 12 12:34:07.287538 master-0 kubenswrapper[13984]: I0312 12:34:07.286264 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79db4dd985-zqfm7"] Mar 12 12:34:07.289635 master-0 kubenswrapper[13984]: I0312 12:34:07.289607 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.308615 master-0 kubenswrapper[13984]: I0312 12:34:07.306157 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79db4dd985-zqfm7"] Mar 12 12:34:07.423607 master-0 kubenswrapper[13984]: I0312 12:34:07.423523 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-config\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.424446 master-0 kubenswrapper[13984]: I0312 12:34:07.423630 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-oauth-serving-cert\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.424446 master-0 kubenswrapper[13984]: I0312 12:34:07.423711 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-serving-cert\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.424446 master-0 kubenswrapper[13984]: I0312 12:34:07.423792 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjf7c\" (UniqueName: \"kubernetes.io/projected/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-kube-api-access-vjf7c\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.424446 
master-0 kubenswrapper[13984]: I0312 12:34:07.423834 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-service-ca\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.424446 master-0 kubenswrapper[13984]: I0312 12:34:07.423857 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-trusted-ca-bundle\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.424446 master-0 kubenswrapper[13984]: I0312 12:34:07.423877 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-oauth-config\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.426651 master-0 kubenswrapper[13984]: I0312 12:34:07.426591 13984 generic.go:334] "Generic (PLEG): container finished" podID="e5e27787-de34-48dd-8854-79387c59fa6c" containerID="5e6b207022d01f1edcb87b0ad4d511a238ef8e018d194c34656d5205676e009b" exitCode=0 Mar 12 12:34:07.426726 master-0 kubenswrapper[13984]: I0312 12:34:07.426656 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerDied","Data":"5e6b207022d01f1edcb87b0ad4d511a238ef8e018d194c34656d5205676e009b"} Mar 12 12:34:07.526142 master-0 kubenswrapper[13984]: I0312 12:34:07.526065 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-vjf7c\" (UniqueName: \"kubernetes.io/projected/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-kube-api-access-vjf7c\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.526294 master-0 kubenswrapper[13984]: I0312 12:34:07.526224 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-service-ca\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.526759 master-0 kubenswrapper[13984]: I0312 12:34:07.526724 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-trusted-ca-bundle\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.526866 master-0 kubenswrapper[13984]: I0312 12:34:07.526834 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-oauth-config\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.527156 master-0 kubenswrapper[13984]: I0312 12:34:07.527128 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-config\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.527258 master-0 kubenswrapper[13984]: I0312 12:34:07.527236 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-oauth-serving-cert\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.527376 master-0 kubenswrapper[13984]: I0312 12:34:07.527340 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-serving-cert\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.527906 master-0 kubenswrapper[13984]: I0312 12:34:07.527873 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-service-ca\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.527971 master-0 kubenswrapper[13984]: I0312 12:34:07.527936 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-trusted-ca-bundle\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.528329 master-0 kubenswrapper[13984]: I0312 12:34:07.528298 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-oauth-serving-cert\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.528604 master-0 kubenswrapper[13984]: I0312 12:34:07.528579 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-config\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.531599 master-0 kubenswrapper[13984]: I0312 12:34:07.531577 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-oauth-config\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.532602 master-0 kubenswrapper[13984]: I0312 12:34:07.532547 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-serving-cert\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.547969 master-0 kubenswrapper[13984]: I0312 12:34:07.547859 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjf7c\" (UniqueName: \"kubernetes.io/projected/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-kube-api-access-vjf7c\") pod \"console-79db4dd985-zqfm7\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:07.614515 master-0 kubenswrapper[13984]: I0312 12:34:07.614443 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:08.097773 master-0 kubenswrapper[13984]: I0312 12:34:08.097103 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79db4dd985-zqfm7"] Mar 12 12:34:08.107700 master-0 kubenswrapper[13984]: W0312 12:34:08.107655 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0485a15a_a4a4_4df7_a74c_d7379f4d01cb.slice/crio-25efc1af59a3bbaf7ecf95082004d95cdc92d21a5c4ab5b3028b6330b943e192 WatchSource:0}: Error finding container 25efc1af59a3bbaf7ecf95082004d95cdc92d21a5c4ab5b3028b6330b943e192: Status 404 returned error can't find the container with id 25efc1af59a3bbaf7ecf95082004d95cdc92d21a5c4ab5b3028b6330b943e192 Mar 12 12:34:08.440515 master-0 kubenswrapper[13984]: I0312 12:34:08.440067 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerStarted","Data":"eb640f5568fc734f805348416fec2b4f8ffaf8a7d933e57f0835997aded72698"} Mar 12 12:34:08.440515 master-0 kubenswrapper[13984]: I0312 12:34:08.440117 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerStarted","Data":"841cde39a0ec388f41ec877b4534bac80a993c6ce90eb789975f4fcd02f4dc65"} Mar 12 12:34:08.440515 master-0 kubenswrapper[13984]: I0312 12:34:08.440127 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerStarted","Data":"82edfc7324b6e383c2da409bfeab97f0935d317a485e8794e4005528b899330a"} Mar 12 12:34:08.440515 master-0 kubenswrapper[13984]: I0312 12:34:08.440137 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerStarted","Data":"bd3d634fc3faa1a9cb11a784f969df5607f41b3e411f2f62116fe8c9c04a6eed"} Mar 12 12:34:08.440515 master-0 kubenswrapper[13984]: I0312 12:34:08.440145 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerStarted","Data":"2541ef7d057d6c3b3b6566d8e7d37921bfaa70d02bc9a456f7f66cd8a4767d04"} Mar 12 12:34:08.446522 master-0 kubenswrapper[13984]: I0312 12:34:08.444807 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79db4dd985-zqfm7" event={"ID":"0485a15a-a4a4-4df7-a74c-d7379f4d01cb","Type":"ContainerStarted","Data":"4cad2d72886f2435dbe577ee337f88f9af220f6e90117513eb2892126e93c01f"} Mar 12 12:34:08.446522 master-0 kubenswrapper[13984]: I0312 12:34:08.444853 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79db4dd985-zqfm7" event={"ID":"0485a15a-a4a4-4df7-a74c-d7379f4d01cb","Type":"ContainerStarted","Data":"25efc1af59a3bbaf7ecf95082004d95cdc92d21a5c4ab5b3028b6330b943e192"} Mar 12 12:34:08.468453 master-0 kubenswrapper[13984]: I0312 12:34:08.466525 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79db4dd985-zqfm7" podStartSLOduration=1.46647186 podStartE2EDuration="1.46647186s" podCreationTimestamp="2026-03-12 12:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:34:08.466251244 +0000 UTC m=+580.664266756" watchObservedRunningTime="2026-03-12 12:34:08.46647186 +0000 UTC m=+580.664487352" Mar 12 12:34:08.770149 master-0 kubenswrapper[13984]: I0312 12:34:08.770069 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": 
dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body= Mar 12 12:34:08.770412 master-0 kubenswrapper[13984]: I0312 12:34:08.770155 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" Mar 12 12:34:09.443853 master-0 kubenswrapper[13984]: I0312 12:34:09.443795 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_343917c5-143b-4807-aac0-c3aef3105186/installer/0.log" Mar 12 12:34:09.444947 master-0 kubenswrapper[13984]: I0312 12:34:09.443880 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:34:09.454914 master-0 kubenswrapper[13984]: I0312 12:34:09.454846 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_343917c5-143b-4807-aac0-c3aef3105186/installer/0.log" Mar 12 12:34:09.454914 master-0 kubenswrapper[13984]: I0312 12:34:09.454906 13984 generic.go:334] "Generic (PLEG): container finished" podID="343917c5-143b-4807-aac0-c3aef3105186" containerID="7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278" exitCode=1 Mar 12 12:34:09.455225 master-0 kubenswrapper[13984]: I0312 12:34:09.454965 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"343917c5-143b-4807-aac0-c3aef3105186","Type":"ContainerDied","Data":"7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278"} Mar 12 12:34:09.455225 master-0 kubenswrapper[13984]: I0312 12:34:09.454993 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" 
event={"ID":"343917c5-143b-4807-aac0-c3aef3105186","Type":"ContainerDied","Data":"786b6c917163257cc98cdcd743ba9fdde876ebceb2ed624febbccb5798126329"} Mar 12 12:34:09.455225 master-0 kubenswrapper[13984]: I0312 12:34:09.455010 13984 scope.go:117] "RemoveContainer" containerID="7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278" Mar 12 12:34:09.455556 master-0 kubenswrapper[13984]: I0312 12:34:09.454973 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 12 12:34:09.479635 master-0 kubenswrapper[13984]: I0312 12:34:09.479537 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e5e27787-de34-48dd-8854-79387c59fa6c","Type":"ContainerStarted","Data":"f55331869d2f3114f76fcaf07ebb977fee915d0db6421b25fe68df6561f96c37"} Mar 12 12:34:09.497731 master-0 kubenswrapper[13984]: I0312 12:34:09.496714 13984 scope.go:117] "RemoveContainer" containerID="7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278" Mar 12 12:34:09.497731 master-0 kubenswrapper[13984]: E0312 12:34:09.497048 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278\": container with ID starting with 7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278 not found: ID does not exist" containerID="7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278" Mar 12 12:34:09.497731 master-0 kubenswrapper[13984]: I0312 12:34:09.497111 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278"} err="failed to get container status \"7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278\": rpc error: code = NotFound desc = could not find container 
\"7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278\": container with ID starting with 7da0d6519c02be6cedc813fd71e5bbe404b98057ae4649964373248a8f510278 not found: ID does not exist" Mar 12 12:34:09.525758 master-0 kubenswrapper[13984]: I0312 12:34:09.525686 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=4.525666598 podStartE2EDuration="4.525666598s" podCreationTimestamp="2026-03-12 12:34:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:34:09.523278009 +0000 UTC m=+581.721293511" watchObservedRunningTime="2026-03-12 12:34:09.525666598 +0000 UTC m=+581.723682090" Mar 12 12:34:09.561973 master-0 kubenswrapper[13984]: I0312 12:34:09.561921 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-var-lock\") pod \"343917c5-143b-4807-aac0-c3aef3105186\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " Mar 12 12:34:09.562180 master-0 kubenswrapper[13984]: I0312 12:34:09.562016 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343917c5-143b-4807-aac0-c3aef3105186-kube-api-access\") pod \"343917c5-143b-4807-aac0-c3aef3105186\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " Mar 12 12:34:09.562180 master-0 kubenswrapper[13984]: I0312 12:34:09.562142 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-kubelet-dir\") pod \"343917c5-143b-4807-aac0-c3aef3105186\" (UID: \"343917c5-143b-4807-aac0-c3aef3105186\") " Mar 12 12:34:09.562372 master-0 kubenswrapper[13984]: I0312 12:34:09.562352 13984 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-var-lock" (OuterVolumeSpecName: "var-lock") pod "343917c5-143b-4807-aac0-c3aef3105186" (UID: "343917c5-143b-4807-aac0-c3aef3105186"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:34:09.562564 master-0 kubenswrapper[13984]: I0312 12:34:09.562470 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "343917c5-143b-4807-aac0-c3aef3105186" (UID: "343917c5-143b-4807-aac0-c3aef3105186"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:34:09.563621 master-0 kubenswrapper[13984]: I0312 12:34:09.563591 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:09.563621 master-0 kubenswrapper[13984]: I0312 12:34:09.563613 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/343917c5-143b-4807-aac0-c3aef3105186-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:09.565375 master-0 kubenswrapper[13984]: I0312 12:34:09.565313 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/343917c5-143b-4807-aac0-c3aef3105186-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "343917c5-143b-4807-aac0-c3aef3105186" (UID: "343917c5-143b-4807-aac0-c3aef3105186"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:34:09.664692 master-0 kubenswrapper[13984]: I0312 12:34:09.664579 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/343917c5-143b-4807-aac0-c3aef3105186-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:09.793721 master-0 kubenswrapper[13984]: I0312 12:34:09.793664 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 12 12:34:09.802373 master-0 kubenswrapper[13984]: I0312 12:34:09.801895 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 12 12:34:09.992188 master-0 kubenswrapper[13984]: I0312 12:34:09.992124 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="343917c5-143b-4807-aac0-c3aef3105186" path="/var/lib/kubelet/pods/343917c5-143b-4807-aac0-c3aef3105186/volumes" Mar 12 12:34:10.871947 master-0 kubenswrapper[13984]: I0312 12:34:10.871844 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:34:17.615549 master-0 kubenswrapper[13984]: I0312 12:34:17.615197 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:17.615549 master-0 kubenswrapper[13984]: I0312 12:34:17.615290 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:34:17.621940 master-0 kubenswrapper[13984]: I0312 12:34:17.617471 13984 patch_prober.go:28] interesting pod/console-79db4dd985-zqfm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Mar 12 12:34:17.621940 master-0 kubenswrapper[13984]: I0312 12:34:17.617598 13984 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Mar 12 12:34:18.768313 master-0 kubenswrapper[13984]: I0312 12:34:18.768251 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body= Mar 12 12:34:18.768840 master-0 kubenswrapper[13984]: I0312 12:34:18.768317 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" Mar 12 12:34:25.618853 master-0 kubenswrapper[13984]: I0312 12:34:25.618750 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-t5sfc" event={"ID":"aa78efd5-b1b1-4b64-8ece-480566fadbca","Type":"ContainerStarted","Data":"a3aafd0d2cc9ac9828b1ec397f9f30f208f1a88b025b81cfe912f8ae9b52cdbc"} Mar 12 12:34:25.620008 master-0 kubenswrapper[13984]: I0312 12:34:25.619839 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-84f57b9877-t5sfc" Mar 12 12:34:25.622726 master-0 kubenswrapper[13984]: I0312 12:34:25.622689 13984 patch_prober.go:28] interesting pod/downloads-84f57b9877-t5sfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.88:8080/\": dial tcp 10.128.0.88:8080: connect: connection refused" start-of-body= Mar 12 12:34:25.622899 master-0 kubenswrapper[13984]: I0312 12:34:25.622736 13984 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-t5sfc" podUID="aa78efd5-b1b1-4b64-8ece-480566fadbca" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.88:8080/\": dial tcp 10.128.0.88:8080: connect: connection refused" Mar 12 12:34:25.821713 master-0 kubenswrapper[13984]: I0312 12:34:25.821640 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-84f57b9877-t5sfc" podStartSLOduration=2.9980170900000003 podStartE2EDuration="37.821623141s" podCreationTimestamp="2026-03-12 12:33:48 +0000 UTC" firstStartedPulling="2026-03-12 12:33:49.676117339 +0000 UTC m=+561.874132831" lastFinishedPulling="2026-03-12 12:34:24.49972337 +0000 UTC m=+596.697738882" observedRunningTime="2026-03-12 12:34:25.817114884 +0000 UTC m=+598.015130426" watchObservedRunningTime="2026-03-12 12:34:25.821623141 +0000 UTC m=+598.019638633" Mar 12 12:34:26.646082 master-0 kubenswrapper[13984]: I0312 12:34:26.644819 13984 patch_prober.go:28] interesting pod/downloads-84f57b9877-t5sfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.88:8080/\": dial tcp 10.128.0.88:8080: connect: connection refused" start-of-body= Mar 12 12:34:26.646082 master-0 kubenswrapper[13984]: I0312 12:34:26.644910 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-t5sfc" podUID="aa78efd5-b1b1-4b64-8ece-480566fadbca" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.88:8080/\": dial tcp 10.128.0.88:8080: connect: connection refused" Mar 12 12:34:27.616198 master-0 kubenswrapper[13984]: I0312 12:34:27.616034 13984 patch_prober.go:28] interesting pod/console-79db4dd985-zqfm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Mar 
12 12:34:27.616198 master-0 kubenswrapper[13984]: I0312 12:34:27.616117 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Mar 12 12:34:27.651620 master-0 kubenswrapper[13984]: I0312 12:34:27.651505 13984 patch_prober.go:28] interesting pod/downloads-84f57b9877-t5sfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.88:8080/\": dial tcp 10.128.0.88:8080: connect: connection refused" start-of-body= Mar 12 12:34:27.651620 master-0 kubenswrapper[13984]: I0312 12:34:27.651580 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-t5sfc" podUID="aa78efd5-b1b1-4b64-8ece-480566fadbca" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.88:8080/\": dial tcp 10.128.0.88:8080: connect: connection refused" Mar 12 12:34:28.376258 master-0 kubenswrapper[13984]: I0312 12:34:28.376183 13984 scope.go:117] "RemoveContainer" containerID="765bad2b5a42ad204d381d1e7b6c234c53c95f09dcfc3c8f5a96103e042780ef" Mar 12 12:34:28.395959 master-0 kubenswrapper[13984]: I0312 12:34:28.395900 13984 scope.go:117] "RemoveContainer" containerID="0d50ef84a902d60b2d7d410974b24e4f48e1a63818f16207d507864a9e96ea0a" Mar 12 12:34:28.769349 master-0 kubenswrapper[13984]: I0312 12:34:28.769274 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body= Mar 12 12:34:28.770072 master-0 kubenswrapper[13984]: I0312 12:34:28.769360 13984 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" Mar 12 12:34:29.294535 master-0 kubenswrapper[13984]: I0312 12:34:29.294463 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-84f57b9877-t5sfc" Mar 12 12:34:37.617838 master-0 kubenswrapper[13984]: I0312 12:34:37.617750 13984 patch_prober.go:28] interesting pod/console-79db4dd985-zqfm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Mar 12 12:34:37.618782 master-0 kubenswrapper[13984]: I0312 12:34:37.618239 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Mar 12 12:34:38.769132 master-0 kubenswrapper[13984]: I0312 12:34:38.769054 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body= Mar 12 12:34:38.769132 master-0 kubenswrapper[13984]: I0312 12:34:38.769125 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" Mar 12 12:34:47.616611 master-0 kubenswrapper[13984]: I0312 12:34:47.616520 13984 patch_prober.go:28] interesting 
pod/console-79db4dd985-zqfm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Mar 12 12:34:47.617319 master-0 kubenswrapper[13984]: I0312 12:34:47.616641 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Mar 12 12:34:48.769389 master-0 kubenswrapper[13984]: I0312 12:34:48.769245 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body= Mar 12 12:34:48.769389 master-0 kubenswrapper[13984]: I0312 12:34:48.769356 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" Mar 12 12:34:53.398616 master-0 kubenswrapper[13984]: I0312 12:34:53.398529 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:34:53.403162 master-0 kubenswrapper[13984]: I0312 12:34:53.403125 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod 
\"installer-1-master-0\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 12 12:34:53.602057 master-0 kubenswrapper[13984]: I0312 12:34:53.601985 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") pod \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\" (UID: \"48e7be9a-921a-42b0-b9ae-b7ffd28c89a4\") " Mar 12 12:34:53.606077 master-0 kubenswrapper[13984]: I0312 12:34:53.606018 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4" (UID: "48e7be9a-921a-42b0-b9ae-b7ffd28c89a4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:34:53.704290 master-0 kubenswrapper[13984]: I0312 12:34:53.704202 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48e7be9a-921a-42b0-b9ae-b7ffd28c89a4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:34:54.294150 master-0 kubenswrapper[13984]: I0312 12:34:54.294069 13984 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 12 12:34:54.294561 master-0 kubenswrapper[13984]: I0312 12:34:54.294499 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://545ce6883090e4ecdcea7622975d3e1147cda32524b819153cd3b1df99e6ad16" gracePeriod=15 Mar 12 12:34:54.294662 master-0 kubenswrapper[13984]: I0312 12:34:54.294552 13984 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://dbc67a0df6d005a6dc51cf6a3278cefd084341f566051f53aeef112699601de2" gracePeriod=15 Mar 12 12:34:54.294662 master-0 kubenswrapper[13984]: I0312 12:34:54.294588 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" containerID="cri-o://a3e42264f81889f53565ef287ba1a41d67422c625a1e231bc34f310555991c94" gracePeriod=15 Mar 12 12:34:54.294805 master-0 kubenswrapper[13984]: I0312 12:34:54.294622 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f6fe46052f0fc88bb12a846f99f359af9fcb5cab5db347f3b65949fb3346c632" gracePeriod=15 Mar 12 12:34:54.294805 master-0 kubenswrapper[13984]: I0312 12:34:54.294464 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" containerID="cri-o://700614ee74998a2f22b9ab087ce7bf59a848008e38ccd5dd9ed6bc2ca56e75b9" gracePeriod=15 Mar 12 12:34:54.303866 master-0 kubenswrapper[13984]: I0312 12:34:54.303783 13984 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: E0312 12:34:54.304273 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: I0312 12:34:54.304301 13984 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: E0312 12:34:54.304322 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: I0312 12:34:54.304335 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: E0312 12:34:54.304358 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: I0312 12:34:54.304370 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: E0312 12:34:54.304391 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: I0312 12:34:54.304403 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: E0312 12:34:54.304420 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: I0312 12:34:54.304432 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: E0312 12:34:54.304461 13984 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="343917c5-143b-4807-aac0-c3aef3105186" containerName="installer" Mar 12 12:34:54.304515 master-0 kubenswrapper[13984]: I0312 12:34:54.304474 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="343917c5-143b-4807-aac0-c3aef3105186" containerName="installer" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: E0312 12:34:54.304541 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304553 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304746 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304773 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304799 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304818 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304873 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304912 13984 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="343917c5-143b-4807-aac0-c3aef3105186" containerName="installer" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.304941 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: E0312 12:34:54.305238 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 12:34:54.305332 master-0 kubenswrapper[13984]: I0312 12:34:54.305254 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" Mar 12 12:34:54.308336 master-0 kubenswrapper[13984]: I0312 12:34:54.308007 13984 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 12 12:34:54.309109 master-0 kubenswrapper[13984]: I0312 12:34:54.309070 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.313150 master-0 kubenswrapper[13984]: I0312 12:34:54.313084 13984 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" podUID="077dd10388b9e3e48a07382126e86621" Mar 12 12:34:54.315869 master-0 kubenswrapper[13984]: I0312 12:34:54.315804 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.315977 master-0 kubenswrapper[13984]: I0312 12:34:54.315924 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.316067 master-0 kubenswrapper[13984]: I0312 12:34:54.315994 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.316137 master-0 kubenswrapper[13984]: I0312 12:34:54.316064 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.316137 master-0 kubenswrapper[13984]: I0312 12:34:54.316107 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.316331 master-0 kubenswrapper[13984]: I0312 12:34:54.316286 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.316404 master-0 kubenswrapper[13984]: I0312 12:34:54.316354 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.316466 master-0 kubenswrapper[13984]: I0312 12:34:54.316427 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.418324 master-0 kubenswrapper[13984]: I0312 12:34:54.418264 13984 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.418724 master-0 kubenswrapper[13984]: I0312 12:34:54.418361 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.418724 master-0 kubenswrapper[13984]: I0312 12:34:54.418399 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.418724 master-0 kubenswrapper[13984]: I0312 12:34:54.418461 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.418724 master-0 kubenswrapper[13984]: I0312 12:34:54.418532 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.418724 master-0 kubenswrapper[13984]: I0312 12:34:54.418582 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.418724 master-0 kubenswrapper[13984]: I0312 12:34:54.418694 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.418902 master-0 kubenswrapper[13984]: I0312 12:34:54.418763 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.418935 master-0 kubenswrapper[13984]: I0312 12:34:54.418900 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.418986 master-0 kubenswrapper[13984]: I0312 12:34:54.418960 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.419036 
master-0 kubenswrapper[13984]: I0312 12:34:54.419014 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.419086 master-0 kubenswrapper[13984]: I0312 12:34:54.419065 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.419132 master-0 kubenswrapper[13984]: I0312 12:34:54.419111 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.419166 master-0 kubenswrapper[13984]: I0312 12:34:54.419156 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:34:54.419208 master-0 kubenswrapper[13984]: I0312 12:34:54.419197 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 
12:34:54.419254 master-0 kubenswrapper[13984]: I0312 12:34:54.419237 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:34:54.922810 master-0 kubenswrapper[13984]: I0312 12:34:54.922758 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 12 12:34:54.923639 master-0 kubenswrapper[13984]: I0312 12:34:54.923597 13984 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="545ce6883090e4ecdcea7622975d3e1147cda32524b819153cd3b1df99e6ad16" exitCode=0 Mar 12 12:34:54.923639 master-0 kubenswrapper[13984]: I0312 12:34:54.923627 13984 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="a3e42264f81889f53565ef287ba1a41d67422c625a1e231bc34f310555991c94" exitCode=0 Mar 12 12:34:54.923639 master-0 kubenswrapper[13984]: I0312 12:34:54.923635 13984 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="dbc67a0df6d005a6dc51cf6a3278cefd084341f566051f53aeef112699601de2" exitCode=0 Mar 12 12:34:54.923639 master-0 kubenswrapper[13984]: I0312 12:34:54.923643 13984 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="f6fe46052f0fc88bb12a846f99f359af9fcb5cab5db347f3b65949fb3346c632" exitCode=2 Mar 12 12:34:54.923799 master-0 kubenswrapper[13984]: I0312 12:34:54.923714 13984 scope.go:117] "RemoveContainer" containerID="531a537f715644fc0623e243f05c5d1fb97e1cf2fd31e77e02a07259ef5d606f" Mar 12 12:34:55.934869 master-0 kubenswrapper[13984]: I0312 12:34:55.934818 
13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 12 12:34:57.616896 master-0 kubenswrapper[13984]: I0312 12:34:57.616766 13984 patch_prober.go:28] interesting pod/console-79db4dd985-zqfm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body= Mar 12 12:34:57.616896 master-0 kubenswrapper[13984]: I0312 12:34:57.616876 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" Mar 12 12:34:58.768736 master-0 kubenswrapper[13984]: I0312 12:34:58.768669 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body= Mar 12 12:34:58.769339 master-0 kubenswrapper[13984]: I0312 12:34:58.768777 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" Mar 12 12:35:00.356952 master-0 kubenswrapper[13984]: E0312 12:35:00.352063 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 
12:35:00.356952 master-0 kubenswrapper[13984]: I0312 12:35:00.352715 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:35:00.402049 master-0 kubenswrapper[13984]: W0312 12:35:00.400432 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899242a15b2bdf3b4a04fb323647ca94.slice/crio-de851fbc9810d703102feaf502594838a6e3fbcab18de78435f5160b928867f5 WatchSource:0}: Error finding container de851fbc9810d703102feaf502594838a6e3fbcab18de78435f5160b928867f5: Status 404 returned error can't find the container with id de851fbc9810d703102feaf502594838a6e3fbcab18de78435f5160b928867f5 Mar 12 12:35:00.749857 master-0 kubenswrapper[13984]: E0312 12:35:00.749304 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:00.760504 master-0 kubenswrapper[13984]: E0312 12:35:00.754407 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:00.760504 master-0 kubenswrapper[13984]: E0312 12:35:00.759981 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:00.768397 master-0 kubenswrapper[13984]: E0312 12:35:00.762926 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 
192.168.32.10:6443: connect: connection refused" Mar 12 12:35:00.768397 master-0 kubenswrapper[13984]: E0312 12:35:00.763430 13984 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:00.768397 master-0 kubenswrapper[13984]: I0312 12:35:00.763453 13984 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 12 12:35:00.768397 master-0 kubenswrapper[13984]: E0312 12:35:00.764674 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 12 12:35:00.966412 master-0 kubenswrapper[13984]: E0312 12:35:00.966362 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 12 12:35:00.985404 master-0 kubenswrapper[13984]: I0312 12:35:00.982894 13984 generic.go:334] "Generic (PLEG): container finished" podID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" containerID="1025b88b196b472d089ebe4edf860336100d1d80146a29ff9b6c08185c199ccd" exitCode=0 Mar 12 12:35:00.985404 master-0 kubenswrapper[13984]: I0312 12:35:00.982961 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a40de76a-de63-4e37-b1ab-b7fe767e67ea","Type":"ContainerDied","Data":"1025b88b196b472d089ebe4edf860336100d1d80146a29ff9b6c08185c199ccd"} Mar 12 12:35:00.985404 master-0 kubenswrapper[13984]: I0312 12:35:00.984325 13984 status_manager.go:851] 
"Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:00.985404 master-0 kubenswrapper[13984]: I0312 12:35:00.985160 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"b6f1cba7d9b8571df4e1cbef3aba679da425e257dfae8755a672b28c421b9453"} Mar 12 12:35:00.985404 master-0 kubenswrapper[13984]: I0312 12:35:00.985196 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"de851fbc9810d703102feaf502594838a6e3fbcab18de78435f5160b928867f5"} Mar 12 12:35:00.986029 master-0 kubenswrapper[13984]: E0312 12:35:00.986002 13984 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 12 12:35:00.986105 master-0 kubenswrapper[13984]: I0312 12:35:00.986046 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:00.989214 master-0 kubenswrapper[13984]: I0312 12:35:00.989197 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 12 12:35:00.989806 master-0 kubenswrapper[13984]: I0312 12:35:00.989727 13984 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="700614ee74998a2f22b9ab087ce7bf59a848008e38ccd5dd9ed6bc2ca56e75b9" exitCode=0 Mar 12 12:35:01.367863 master-0 kubenswrapper[13984]: E0312 12:35:01.367793 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 12 12:35:02.016718 master-0 kubenswrapper[13984]: I0312 12:35:02.016683 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 12 12:35:02.019961 master-0 kubenswrapper[13984]: I0312 12:35:02.017246 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648ff05a8c05edb0ff4fb29aac79f95d1724f9709816a1a74090562fd93e69df" Mar 12 12:35:02.049549 master-0 kubenswrapper[13984]: I0312 12:35:02.044861 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log" Mar 12 12:35:02.049549 master-0 kubenswrapper[13984]: I0312 12:35:02.046092 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:35:02.049549 master-0 kubenswrapper[13984]: I0312 12:35:02.048593 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:02.049549 master-0 kubenswrapper[13984]: I0312 12:35:02.049421 13984 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.082610 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.082796 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.082789 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.082899 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.083116 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.083228 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.083539 13984 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.083556 13984 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:35:02.085658 master-0 kubenswrapper[13984]: I0312 12:35:02.083564 13984 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:35:02.170382 master-0 kubenswrapper[13984]: E0312 12:35:02.170306 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 12 12:35:02.439931 master-0 kubenswrapper[13984]: I0312 12:35:02.439859 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:35:02.441558 master-0 kubenswrapper[13984]: I0312 12:35:02.441510 13984 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:02.442183 master-0 kubenswrapper[13984]: I0312 12:35:02.442139 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:02.511180 master-0 kubenswrapper[13984]: I0312 12:35:02.511092 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-var-lock\") pod \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " Mar 12 12:35:02.511180 master-0 kubenswrapper[13984]: I0312 12:35:02.511178 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kubelet-dir\") pod \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " Mar 12 12:35:02.512582 master-0 kubenswrapper[13984]: I0312 12:35:02.511233 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-var-lock" (OuterVolumeSpecName: "var-lock") pod "a40de76a-de63-4e37-b1ab-b7fe767e67ea" (UID: "a40de76a-de63-4e37-b1ab-b7fe767e67ea"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:35:02.512582 master-0 kubenswrapper[13984]: I0312 12:35:02.511327 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a40de76a-de63-4e37-b1ab-b7fe767e67ea" (UID: "a40de76a-de63-4e37-b1ab-b7fe767e67ea"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:35:02.512582 master-0 kubenswrapper[13984]: I0312 12:35:02.511735 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:35:02.512582 master-0 kubenswrapper[13984]: I0312 12:35:02.511759 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:35:02.612605 master-0 kubenswrapper[13984]: I0312 12:35:02.612520 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kube-api-access\") pod \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\" (UID: \"a40de76a-de63-4e37-b1ab-b7fe767e67ea\") " Mar 12 12:35:02.616341 master-0 kubenswrapper[13984]: I0312 12:35:02.616264 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a40de76a-de63-4e37-b1ab-b7fe767e67ea" (UID: "a40de76a-de63-4e37-b1ab-b7fe767e67ea"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:35:02.623348 master-0 kubenswrapper[13984]: E0312 12:35:02.623194 13984 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/events/console-79db4dd985-zqfm7.189c181b05cbe850\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Mar 12 12:35:02.623348 master-0 kubenswrapper[13984]: &Event{ObjectMeta:{console-79db4dd985-zqfm7.189c181b05cbe850 openshift-console 15222 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-79db4dd985-zqfm7,UID:0485a15a-a4a4-4df7-a74c-d7379f4d01cb,APIVersion:v1,ResourceVersion:15061,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.94:8443/health": dial tcp 10.128.0.94:8443: connect: connection refused Mar 12 12:35:02.623348 master-0 kubenswrapper[13984]: body: Mar 12 12:35:02.623348 master-0 kubenswrapper[13984]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:34:17 +0000 UTC,LastTimestamp:2026-03-12 12:34:57.616847053 +0000 UTC m=+629.814862585,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 12 12:35:02.623348 master-0 kubenswrapper[13984]: > Mar 12 12:35:02.714957 master-0 kubenswrapper[13984]: I0312 12:35:02.714865 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a40de76a-de63-4e37-b1ab-b7fe767e67ea-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:35:03.025807 master-0 kubenswrapper[13984]: I0312 12:35:03.025666 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:35:03.027713 master-0 kubenswrapper[13984]: I0312 12:35:03.027649 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 12 12:35:03.027713 master-0 kubenswrapper[13984]: I0312 12:35:03.027663 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"a40de76a-de63-4e37-b1ab-b7fe767e67ea","Type":"ContainerDied","Data":"23301463cb1dfd0157af2fc4f13a78e0792270fbc83c1d54a43c63dea326b9ec"} Mar 12 12:35:03.027713 master-0 kubenswrapper[13984]: I0312 12:35:03.027702 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23301463cb1dfd0157af2fc4f13a78e0792270fbc83c1d54a43c63dea326b9ec" Mar 12 12:35:03.048530 master-0 kubenswrapper[13984]: I0312 12:35:03.048457 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:03.049349 master-0 kubenswrapper[13984]: I0312 12:35:03.049259 13984 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:03.050167 master-0 kubenswrapper[13984]: I0312 12:35:03.050090 13984 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:03.050636 master-0 kubenswrapper[13984]: I0312 12:35:03.050585 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:03.771624 master-0 kubenswrapper[13984]: E0312 12:35:03.771550 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 12 12:35:03.991718 master-0 kubenswrapper[13984]: I0312 12:35:03.991625 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" path="/var/lib/kubelet/pods/cdcecc61ff5eeb08bd2a3ac12599e4f9/volumes" Mar 12 12:35:04.924497 master-0 kubenswrapper[13984]: E0312 12:35:04.923991 13984 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-console/events/console-79db4dd985-zqfm7.189c181b05cbe850\": dial tcp 192.168.32.10:6443: connect: connection refused" event=< Mar 12 12:35:04.924497 master-0 kubenswrapper[13984]: &Event{ObjectMeta:{console-79db4dd985-zqfm7.189c181b05cbe850 openshift-console 15222 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-79db4dd985-zqfm7,UID:0485a15a-a4a4-4df7-a74c-d7379f4d01cb,APIVersion:v1,ResourceVersion:15061,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe 
error: Get "https://10.128.0.94:8443/health": dial tcp 10.128.0.94:8443: connect: connection refused Mar 12 12:35:04.924497 master-0 kubenswrapper[13984]: body: Mar 12 12:35:04.924497 master-0 kubenswrapper[13984]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-12 12:34:17 +0000 UTC,LastTimestamp:2026-03-12 12:34:57.616847053 +0000 UTC m=+629.814862585,Count:5,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 12 12:35:04.924497 master-0 kubenswrapper[13984]: > Mar 12 12:35:05.872008 master-0 kubenswrapper[13984]: I0312 12:35:05.871927 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:35:05.907216 master-0 kubenswrapper[13984]: I0312 12:35:05.907116 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:35:05.909190 master-0 kubenswrapper[13984]: I0312 12:35:05.908946 13984 status_manager.go:851] "Failed to get status for pod" podUID="e5e27787-de34-48dd-8854-79387c59fa6c" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:05.910212 master-0 kubenswrapper[13984]: I0312 12:35:05.910149 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:05.979380 master-0 kubenswrapper[13984]: I0312 12:35:05.979289 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:35:05.981510 master-0 kubenswrapper[13984]: I0312 12:35:05.981422 13984 status_manager.go:851] "Failed to get status for pod" podUID="e5e27787-de34-48dd-8854-79387c59fa6c" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:05.983064 master-0 kubenswrapper[13984]: I0312 12:35:05.982990 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:06.041948 master-0 kubenswrapper[13984]: I0312 12:35:06.041898 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875" Mar 12 12:35:06.041948 master-0 kubenswrapper[13984]: I0312 12:35:06.041939 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875" Mar 12 12:35:06.042697 master-0 kubenswrapper[13984]: E0312 12:35:06.042658 13984 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:35:06.043316 master-0 kubenswrapper[13984]: I0312 12:35:06.043296 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:35:06.071669 master-0 kubenswrapper[13984]: W0312 12:35:06.069562 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077dd10388b9e3e48a07382126e86621.slice/crio-b0154d01e41935669e82de9a2a747a8948c27ba6ceffad28a0df499a118acf07 WatchSource:0}: Error finding container b0154d01e41935669e82de9a2a747a8948c27ba6ceffad28a0df499a118acf07: Status 404 returned error can't find the container with id b0154d01e41935669e82de9a2a747a8948c27ba6ceffad28a0df499a118acf07 Mar 12 12:35:06.075708 master-0 kubenswrapper[13984]: I0312 12:35:06.075507 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 12 12:35:06.076456 master-0 kubenswrapper[13984]: I0312 12:35:06.076401 13984 status_manager.go:851] "Failed to get status for pod" podUID="e5e27787-de34-48dd-8854-79387c59fa6c" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:06.076951 master-0 kubenswrapper[13984]: I0312 12:35:06.076903 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:06.973082 master-0 kubenswrapper[13984]: E0312 12:35:06.973024 13984 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 12 
12:35:07.047750 master-0 kubenswrapper[13984]: I0312 12:35:07.047688 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 12 12:35:07.047750 master-0 kubenswrapper[13984]: I0312 12:35:07.047690 13984 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Liveness probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 12 12:35:07.048346 master-0 kubenswrapper[13984]: I0312 12:35:07.047766 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 12 12:35:07.048346 master-0 kubenswrapper[13984]: I0312 12:35:07.047814 13984 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 12 12:35:07.056690 master-0 kubenswrapper[13984]: I0312 12:35:07.056644 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d2a6a89fd8fe0c6f59a2124101057324/kube-controller-manager/0.log" Mar 12 12:35:07.056884 master-0 kubenswrapper[13984]: I0312 12:35:07.056703 13984 generic.go:334] "Generic (PLEG): 
container finished" podID="d2a6a89fd8fe0c6f59a2124101057324" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" exitCode=1 Mar 12 12:35:07.056884 master-0 kubenswrapper[13984]: I0312 12:35:07.056767 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d2a6a89fd8fe0c6f59a2124101057324","Type":"ContainerDied","Data":"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482"} Mar 12 12:35:07.057694 master-0 kubenswrapper[13984]: I0312 12:35:07.057667 13984 scope.go:117] "RemoveContainer" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" Mar 12 12:35:07.059004 master-0 kubenswrapper[13984]: I0312 12:35:07.058690 13984 status_manager.go:851] "Failed to get status for pod" podUID="d2a6a89fd8fe0c6f59a2124101057324" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:07.059313 master-0 kubenswrapper[13984]: I0312 12:35:07.059274 13984 status_manager.go:851] "Failed to get status for pod" podUID="e5e27787-de34-48dd-8854-79387c59fa6c" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:07.059586 master-0 kubenswrapper[13984]: I0312 12:35:07.059559 13984 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="ba19dfda552090f10cb8a779d761c3fc23186a215ddd58cd3484c4b967fd9944" exitCode=0 Mar 12 12:35:07.059702 master-0 kubenswrapper[13984]: I0312 12:35:07.059584 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerDied","Data":"ba19dfda552090f10cb8a779d761c3fc23186a215ddd58cd3484c4b967fd9944"} Mar 12 12:35:07.059761 master-0 kubenswrapper[13984]: I0312 12:35:07.059708 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"b0154d01e41935669e82de9a2a747a8948c27ba6ceffad28a0df499a118acf07"} Mar 12 12:35:07.060140 master-0 kubenswrapper[13984]: I0312 12:35:07.060098 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 12 12:35:07.060206 master-0 kubenswrapper[13984]: I0312 12:35:07.060155 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875" Mar 12 12:35:07.060206 master-0 kubenswrapper[13984]: I0312 12:35:07.060175 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875" Mar 12 12:35:07.060831 master-0 kubenswrapper[13984]: E0312 12:35:07.060762 13984 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 12 12:35:07.061647 master-0 kubenswrapper[13984]: I0312 12:35:07.061614 13984 status_manager.go:851] "Failed to get status for pod" podUID="d2a6a89fd8fe0c6f59a2124101057324" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:35:07.067853 master-0 kubenswrapper[13984]: I0312 12:35:07.067816 13984 status_manager.go:851] "Failed to get status for pod" podUID="e5e27787-de34-48dd-8854-79387c59fa6c" pod="openshift-monitoring/prometheus-k8s-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/pods/prometheus-k8s-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:35:07.068613 master-0 kubenswrapper[13984]: I0312 12:35:07.068557 13984 status_manager.go:851] "Failed to get status for pod" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 12 12:35:07.616127 master-0 kubenswrapper[13984]: I0312 12:35:07.616071 13984 patch_prober.go:28] interesting pod/console-79db4dd985-zqfm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body=
Mar 12 12:35:07.616289 master-0 kubenswrapper[13984]: I0312 12:35:07.616128 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused"
Mar 12 12:35:08.084609 master-0 kubenswrapper[13984]: I0312 12:35:08.083735 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d2a6a89fd8fe0c6f59a2124101057324/kube-controller-manager/0.log"
Mar 12 12:35:08.084609 master-0 kubenswrapper[13984]: I0312 12:35:08.083824 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d2a6a89fd8fe0c6f59a2124101057324","Type":"ContainerStarted","Data":"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f"}
Mar 12 12:35:08.097451 master-0 kubenswrapper[13984]: I0312 12:35:08.097329 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"b3f91b03f0645e56a9f6a5b68130af5e1c791845dc29d1313c86ce67f0665399"}
Mar 12 12:35:08.097451 master-0 kubenswrapper[13984]: I0312 12:35:08.097398 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"0081806cfac2a171d56d0f080894d65f680abecd6f7ff50d7a03cca935110c3b"}
Mar 12 12:35:08.097451 master-0 kubenswrapper[13984]: I0312 12:35:08.097438 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"f5e31551722028737b78090cc80df2108241678311f3641a22b50cf1c3462496"}
Mar 12 12:35:08.768015 master-0 kubenswrapper[13984]: I0312 12:35:08.767963 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body=
Mar 12 12:35:08.768015 master-0 kubenswrapper[13984]: I0312 12:35:08.768020 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused"
Mar 12 12:35:09.120687 master-0 kubenswrapper[13984]: I0312 12:35:09.115306 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"e4208cad23eb5268c83c88771561e81d9f4a357b7316fe32dd6768b38021e593"}
Mar 12 12:35:09.120687 master-0 kubenswrapper[13984]: I0312 12:35:09.115373 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"e4af402878dd4c164e92e50655ff72cea2dcbb1f8c3c310f794f80770d6465fd"}
Mar 12 12:35:09.120687 master-0 kubenswrapper[13984]: I0312 12:35:09.115699 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875"
Mar 12 12:35:09.120687 master-0 kubenswrapper[13984]: I0312 12:35:09.115716 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875"
Mar 12 12:35:09.120687 master-0 kubenswrapper[13984]: I0312 12:35:09.116095 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:35:11.044508 master-0 kubenswrapper[13984]: I0312 12:35:11.044406 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:35:11.045344 master-0 kubenswrapper[13984]: I0312 12:35:11.044539 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:35:11.051735 master-0 kubenswrapper[13984]: I0312 12:35:11.051665 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:35:14.219396 master-0 kubenswrapper[13984]: I0312 12:35:14.219316 13984 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:35:15.173140 master-0 kubenswrapper[13984]: I0312 12:35:15.173021 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875"
Mar 12 12:35:15.173140 master-0 kubenswrapper[13984]: I0312 12:35:15.173066 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875"
Mar 12 12:35:15.176586 master-0 kubenswrapper[13984]: I0312 12:35:15.176553 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:35:15.286198 master-0 kubenswrapper[13984]: I0312 12:35:15.286133 13984 request.go:700] Waited for 1.001789271s, retries: 1, retry-after: 5s - retry-reason: 503 - request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-cloud-controller-manager-operator/configmaps?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dkube-rbac-proxy&resourceVersion=15166&timeout=36m6s&timeoutSeconds=2166&watch=true
Mar 12 12:35:16.286411 master-0 kubenswrapper[13984]: I0312 12:35:16.286363 13984 request.go:700] Waited for 1.553673535s, retries: 1, retry-after: 5s - retry-reason: 503 - request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/secrets?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dalertmanager-main-dockercfg-24x8c&resourceVersion=14938&timeout=58m32s&timeoutSeconds=3512&watch=true
Mar 12 12:35:17.047285 master-0 kubenswrapper[13984]: I0312 12:35:17.047068 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy"
pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 12 12:35:17.047285 master-0 kubenswrapper[13984]: I0312 12:35:17.047265 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 12 12:35:17.051579 master-0 kubenswrapper[13984]: I0312 12:35:17.050863 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 12 12:35:17.204587 master-0 kubenswrapper[13984]: I0312 12:35:17.204528 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875"
Mar 12 12:35:17.204587 master-0 kubenswrapper[13984]: I0312 12:35:17.204564 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="734753df-f33f-443a-945d-8aadf6f4f875"
Mar 12 12:35:17.210942 master-0 kubenswrapper[13984]: I0312 12:35:17.210758 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 12 12:35:17.305961 master-0 kubenswrapper[13984]: I0312 12:35:17.305827 13984 request.go:700] Waited for 2.017169487s, retries: 1, retry-after: 5s - retry-reason: 503 - request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/configmaps?allowWatchBookmarks=true&fieldSelector=metadata.name%3Dv4-0-config-system-service-ca&resourceVersion=15214&timeout=41m18s&timeoutSeconds=2478&watch=true
Mar 12 12:35:17.572091 master-0 kubenswrapper[13984]: I0312 12:35:17.571971 13984 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="fa7f393a-1b86-4d7e-8436-0c70d4315a95"
Mar 12 12:35:17.616194 master-0 kubenswrapper[13984]: I0312 12:35:17.616132 13984 patch_prober.go:28] interesting pod/console-79db4dd985-zqfm7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused" start-of-body=
Mar 12 12:35:17.616394 master-0 kubenswrapper[13984]: I0312 12:35:17.616216 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" probeResult="failure" output="Get \"https://10.128.0.94:8443/health\": dial tcp 10.128.0.94:8443: connect: connection refused"
Mar 12 12:35:17.684392 master-0 kubenswrapper[13984]: I0312 12:35:17.684322 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 12 12:35:17.715073 master-0 kubenswrapper[13984]: I0312 12:35:17.715036 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 12 12:35:17.988582 master-0 kubenswrapper[13984]: I0312 12:35:17.988535 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 12 12:35:18.036376 master-0 kubenswrapper[13984]: I0312 12:35:18.036330 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-8fdxz"
Mar 12 12:35:18.232586 master-0 kubenswrapper[13984]: I0312 12:35:18.232550 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 12 12:35:18.768054 master-0 kubenswrapper[13984]: I0312 12:35:18.768017 13984 patch_prober.go:28] interesting pod/console-6686f9695f-gkbkr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused" start-of-body=
Mar 12 12:35:18.768678 master-0 kubenswrapper[13984]: I0312 12:35:18.768647 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" probeResult="failure" output="Get \"https://10.128.0.89:8443/health\": dial tcp 10.128.0.89:8443: connect: connection refused"
Mar 12 12:35:18.811082 master-0 kubenswrapper[13984]: I0312 12:35:18.811050 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 12 12:35:18.927532 master-0 kubenswrapper[13984]: I0312 12:35:18.927457 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 12 12:35:19.088524 master-0 kubenswrapper[13984]: I0312 12:35:19.088342 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 12 12:35:19.100752 master-0 kubenswrapper[13984]: I0312 12:35:19.100680 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 12 12:35:19.103036 master-0 kubenswrapper[13984]: I0312 12:35:19.102984 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 12 12:35:19.640317 master-0 kubenswrapper[13984]: I0312 12:35:19.640269 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 12 12:35:19.701460 master-0 kubenswrapper[13984]: I0312 12:35:19.701385 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 12 12:35:19.755326 master-0 kubenswrapper[13984]: I0312 12:35:19.755284 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 12 12:35:19.826255 master-0 kubenswrapper[13984]: I0312 12:35:19.826181 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 12 12:35:20.067002 master-0 kubenswrapper[13984]: I0312 12:35:20.066949 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 12 12:35:20.107861 master-0 kubenswrapper[13984]: I0312 12:35:20.107815 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 12 12:35:20.134674 master-0 kubenswrapper[13984]: I0312 12:35:20.134630 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 12 12:35:20.550500 master-0 kubenswrapper[13984]: I0312 12:35:20.550425 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 12 12:35:20.681764 master-0 kubenswrapper[13984]: I0312 12:35:20.681712 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 12 12:35:20.702706 master-0 kubenswrapper[13984]: I0312 12:35:20.702646 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 12 12:35:20.814089 master-0 kubenswrapper[13984]: I0312 12:35:20.813963 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 12 12:35:20.853820 master-0 kubenswrapper[13984]: I0312 12:35:20.853776 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-dockercfg-k9zjm"
Mar 12 12:35:21.021358 master-0 kubenswrapper[13984]: I0312 12:35:21.021317
13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 12 12:35:21.056309 master-0 kubenswrapper[13984]: I0312 12:35:21.056270 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 12 12:35:21.088101 master-0 kubenswrapper[13984]: I0312 12:35:21.087995 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 12 12:35:21.112638 master-0 kubenswrapper[13984]: I0312 12:35:21.112589 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 12 12:35:21.113064 master-0 kubenswrapper[13984]: I0312 12:35:21.113013 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 12 12:35:21.138514 master-0 kubenswrapper[13984]: I0312 12:35:21.135979 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 12 12:35:21.204032 master-0 kubenswrapper[13984]: I0312 12:35:21.203986 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 12 12:35:21.205353 master-0 kubenswrapper[13984]: I0312 12:35:21.205301 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 12 12:35:21.217219 master-0 kubenswrapper[13984]: I0312 12:35:21.217177 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-29i444egqnkse"
Mar 12 12:35:21.228190 master-0 kubenswrapper[13984]: I0312 12:35:21.228121 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 12 12:35:21.233269 master-0 kubenswrapper[13984]: I0312 12:35:21.233227 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-hsrzl"
Mar 12 12:35:21.236604 master-0 kubenswrapper[13984]: I0312 12:35:21.236574 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 12 12:35:21.296831 master-0 kubenswrapper[13984]: I0312 12:35:21.296774 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 12 12:35:21.450136 master-0 kubenswrapper[13984]: I0312 12:35:21.450085 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 12 12:35:21.500202 master-0 kubenswrapper[13984]: I0312 12:35:21.500160 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-rzflr"
Mar 12 12:35:21.510076 master-0 kubenswrapper[13984]: I0312 12:35:21.510033 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 12 12:35:21.521210 master-0 kubenswrapper[13984]: I0312 12:35:21.521178 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 12 12:35:21.563026 master-0 kubenswrapper[13984]: I0312 12:35:21.562995 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 12 12:35:21.568525 master-0 kubenswrapper[13984]: I0312 12:35:21.568453 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 12 12:35:21.607439 master-0 kubenswrapper[13984]: I0312 12:35:21.607384 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 12 12:35:21.678365 master-0 kubenswrapper[13984]: I0312 12:35:21.678322 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 12 12:35:21.715249 master-0 kubenswrapper[13984]: I0312 12:35:21.715151 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 12 12:35:21.715408 master-0 kubenswrapper[13984]: I0312 12:35:21.715305 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 12 12:35:21.726730 master-0 kubenswrapper[13984]: I0312 12:35:21.726682 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-vhqj5"
Mar 12 12:35:21.751628 master-0 kubenswrapper[13984]: I0312 12:35:21.751567 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-kpcrs"
Mar 12 12:35:21.759501 master-0 kubenswrapper[13984]: I0312 12:35:21.759447 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 12 12:35:21.763700 master-0 kubenswrapper[13984]: I0312 12:35:21.763644 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 12 12:35:21.771618 master-0 kubenswrapper[13984]: I0312 12:35:21.770972 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 12 12:35:21.799080 master-0 kubenswrapper[13984]: I0312 12:35:21.799033 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 12 12:35:21.805455 master-0 kubenswrapper[13984]: I0312 12:35:21.805382 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 12 12:35:21.813342 master-0 kubenswrapper[13984]: I0312 12:35:21.813274 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-2blvc"
Mar 12 12:35:21.823501 master-0 kubenswrapper[13984]: I0312 12:35:21.823456 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 12 12:35:21.831317 master-0 kubenswrapper[13984]: I0312 12:35:21.831270 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 12 12:35:21.841132 master-0 kubenswrapper[13984]: I0312 12:35:21.841090 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 12 12:35:21.866973 master-0 kubenswrapper[13984]: I0312 12:35:21.866930 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 12 12:35:21.898889 master-0 kubenswrapper[13984]: I0312 12:35:21.898846 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 12 12:35:21.899111 master-0 kubenswrapper[13984]: I0312 12:35:21.898863 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 12 12:35:21.934936 master-0 kubenswrapper[13984]: I0312 12:35:21.934887 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 12 12:35:21.939214 master-0 kubenswrapper[13984]: I0312 12:35:21.939146 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-lqvhq"
Mar 12 12:35:21.952386 master-0 kubenswrapper[13984]: I0312 12:35:21.952341 13984 reflector.go:368] Caches
populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 12 12:35:21.954800 master-0 kubenswrapper[13984]: I0312 12:35:21.954759 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-hjhdl"
Mar 12 12:35:21.960688 master-0 kubenswrapper[13984]: I0312 12:35:21.960628 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 12 12:35:22.007331 master-0 kubenswrapper[13984]: I0312 12:35:22.005848 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 12 12:35:22.045983 master-0 kubenswrapper[13984]: I0312 12:35:22.045881 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 12 12:35:22.088712 master-0 kubenswrapper[13984]: I0312 12:35:22.088636 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 12 12:35:22.175611 master-0 kubenswrapper[13984]: I0312 12:35:22.174554 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 12 12:35:22.188885 master-0 kubenswrapper[13984]: I0312 12:35:22.188836 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 12 12:35:22.190902 master-0 kubenswrapper[13984]: I0312 12:35:22.190878 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 12 12:35:22.230560 master-0 kubenswrapper[13984]: I0312 12:35:22.230526 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 12 12:35:22.267988 master-0 kubenswrapper[13984]: I0312 12:35:22.267841 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 12 12:35:22.299393 master-0 kubenswrapper[13984]: I0312 12:35:22.299209 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 12 12:35:22.326389 master-0 kubenswrapper[13984]: I0312 12:35:22.324835 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 12:35:22.365636 master-0 kubenswrapper[13984]: I0312 12:35:22.365575 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 12 12:35:22.381007 master-0 kubenswrapper[13984]: I0312 12:35:22.380942 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 12 12:35:22.384204 master-0 kubenswrapper[13984]: I0312 12:35:22.384164 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 12 12:35:22.465826 master-0 kubenswrapper[13984]: I0312 12:35:22.465751 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 12 12:35:22.474242 master-0 kubenswrapper[13984]: I0312 12:35:22.474163 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 12 12:35:22.491083 master-0 kubenswrapper[13984]: I0312 12:35:22.491029 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 12 12:35:22.630409 master-0 kubenswrapper[13984]: I0312 12:35:22.630283 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 12 12:35:22.645224 master-0 kubenswrapper[13984]: I0312 12:35:22.645172 13984 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 12 12:35:22.646537 master-0 kubenswrapper[13984]: I0312 12:35:22.646494 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 12 12:35:22.651343 master-0 kubenswrapper[13984]: I0312 12:35:22.651281 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 12 12:35:22.651439 master-0 kubenswrapper[13984]: I0312 12:35:22.651353 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 12 12:35:22.666514 master-0 kubenswrapper[13984]: I0312 12:35:22.659731 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 12 12:35:22.678460 master-0 kubenswrapper[13984]: I0312 12:35:22.678310 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-l6bmn"
Mar 12 12:35:22.702596 master-0 kubenswrapper[13984]: I0312 12:35:22.702540 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-24x8c"
Mar 12 12:35:22.731081 master-0 kubenswrapper[13984]: I0312 12:35:22.731038 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 12 12:35:22.753181 master-0 kubenswrapper[13984]: I0312 12:35:22.753145 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 12 12:35:22.753389 master-0 kubenswrapper[13984]: I0312 12:35:22.753310 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 12 12:35:22.778747 master-0 kubenswrapper[13984]: I0312 12:35:22.778707 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 12 12:35:22.787360 master-0 kubenswrapper[13984]: I0312 12:35:22.787321 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 12 12:35:22.839365 master-0 kubenswrapper[13984]: I0312 12:35:22.839306 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 12 12:35:22.887570 master-0 kubenswrapper[13984]: I0312 12:35:22.886970 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 12 12:35:22.942215 master-0 kubenswrapper[13984]: I0312 12:35:22.942159 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 12 12:35:22.951446 master-0 kubenswrapper[13984]: I0312 12:35:22.951396 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 12 12:35:22.983927 master-0 kubenswrapper[13984]: I0312 12:35:22.983861 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 12 12:35:22.986803 master-0 kubenswrapper[13984]: I0312 12:35:22.986755 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 12 12:35:23.053177 master-0 kubenswrapper[13984]: I0312 12:35:23.053079 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 12 12:35:23.060569 master-0 kubenswrapper[13984]: I0312 12:35:23.060106 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 12 12:35:23.060569 master-0 kubenswrapper[13984]: I0312
12:35:23.060311 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 12 12:35:23.095858 master-0 kubenswrapper[13984]: I0312 12:35:23.095769 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 12 12:35:23.095858 master-0 kubenswrapper[13984]: I0312 12:35:23.095776 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 12 12:35:23.124875 master-0 kubenswrapper[13984]: I0312 12:35:23.124829 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 12 12:35:23.147674 master-0 kubenswrapper[13984]: I0312 12:35:23.147537 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-ljrjz"
Mar 12 12:35:23.173617 master-0 kubenswrapper[13984]: I0312 12:35:23.173572 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 12 12:35:23.193229 master-0 kubenswrapper[13984]: I0312 12:35:23.193159 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 12 12:35:23.198288 master-0 kubenswrapper[13984]: I0312 12:35:23.198252 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 12 12:35:23.237511 master-0 kubenswrapper[13984]: I0312 12:35:23.237431 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 12 12:35:23.241637 master-0 kubenswrapper[13984]: I0312 12:35:23.241601 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 12 12:35:23.247629 master-0 kubenswrapper[13984]: I0312 12:35:23.247566 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=9.247543302 podStartE2EDuration="9.247543302s" podCreationTimestamp="2026-03-12 12:35:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:35:22.872924534 +0000 UTC m=+655.070940046" watchObservedRunningTime="2026-03-12 12:35:23.247543302 +0000 UTC m=+655.445558814"
Mar 12 12:35:23.266755 master-0 kubenswrapper[13984]: I0312 12:35:23.266684 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 12 12:35:23.279724 master-0 kubenswrapper[13984]: I0312 12:35:23.279684 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 12 12:35:23.302675 master-0 kubenswrapper[13984]: I0312 12:35:23.302621 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 12 12:35:23.359824 master-0 kubenswrapper[13984]: I0312 12:35:23.359771 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 12 12:35:23.370152 master-0 kubenswrapper[13984]: I0312 12:35:23.370100 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 12 12:35:23.442615 master-0 kubenswrapper[13984]: I0312 12:35:23.442577 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-dk5lp"
Mar 12 12:35:23.478392 master-0 kubenswrapper[13984]: I0312 12:35:23.478336 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 12 12:35:23.584793 master-0 kubenswrapper[13984]: I0312 12:35:23.584752 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 12 12:35:23.586048 master-0 kubenswrapper[13984]: I0312 12:35:23.586032 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 12 12:35:23.601306 master-0 kubenswrapper[13984]: I0312 12:35:23.601260 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 12 12:35:23.605816 master-0 kubenswrapper[13984]: I0312 12:35:23.605757 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 12 12:35:23.611496 master-0 kubenswrapper[13984]: I0312 12:35:23.611440 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 12 12:35:23.683598 master-0 kubenswrapper[13984]: I0312 12:35:23.683530 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 12 12:35:23.688352 master-0 kubenswrapper[13984]: I0312 12:35:23.688312 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 12 12:35:23.706162 master-0 kubenswrapper[13984]: I0312 12:35:23.706072 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 12 12:35:23.735906 master-0 kubenswrapper[13984]: I0312 12:35:23.735833 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 12 12:35:23.767791 master-0 kubenswrapper[13984]: I0312 12:35:23.767750 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 12 12:35:23.805629 master-0 kubenswrapper[13984]: I0312 12:35:23.805592 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-cr5kf"
Mar 12 12:35:23.816332 master-0 kubenswrapper[13984]: I0312 12:35:23.816295 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 12 12:35:23.817253 master-0 kubenswrapper[13984]: I0312 12:35:23.817239 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-spkrx"
Mar 12 12:35:23.867884 master-0 kubenswrapper[13984]: I0312 12:35:23.867839 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 12 12:35:23.869259 master-0 kubenswrapper[13984]: I0312 12:35:23.869223 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 12 12:35:23.875896 master-0 kubenswrapper[13984]: I0312 12:35:23.875851 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 12 12:35:23.881395 master-0 kubenswrapper[13984]: I0312 12:35:23.881364 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 12 12:35:23.896094 master-0 kubenswrapper[13984]: I0312 12:35:23.896062 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 12 12:35:23.902703 master-0 kubenswrapper[13984]: I0312 12:35:23.902673 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 12 12:35:23.912822 master-0 kubenswrapper[13984]: I0312 12:35:23.912747 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 12 12:35:23.966037 master-0 kubenswrapper[13984]: I0312
12:35:23.965891 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 12 12:35:23.996812 master-0 kubenswrapper[13984]: I0312 12:35:23.996110 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 12 12:35:24.036648 master-0 kubenswrapper[13984]: I0312 12:35:24.036593 13984 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 12 12:35:24.046811 master-0 kubenswrapper[13984]: I0312 12:35:24.046764 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 12 12:35:24.079353 master-0 kubenswrapper[13984]: I0312 12:35:24.079287 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 12 12:35:24.113366 master-0 kubenswrapper[13984]: I0312 12:35:24.113301 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 12 12:35:24.181172 master-0 kubenswrapper[13984]: I0312 12:35:24.181109 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 12 12:35:24.224550 master-0 kubenswrapper[13984]: I0312 12:35:24.224419 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 12 12:35:24.225354 master-0 kubenswrapper[13984]: I0312 12:35:24.225325 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 12 12:35:24.230644 master-0 kubenswrapper[13984]: I0312 12:35:24.230595 13984 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 12 12:35:24.240827 master-0 
kubenswrapper[13984]: I0312 12:35:24.240770 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 12 12:35:24.343974 master-0 kubenswrapper[13984]: I0312 12:35:24.343924 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-c479d" Mar 12 12:35:24.381035 master-0 kubenswrapper[13984]: I0312 12:35:24.380945 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 12 12:35:24.407387 master-0 kubenswrapper[13984]: I0312 12:35:24.407328 13984 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 12 12:35:24.407638 master-0 kubenswrapper[13984]: I0312 12:35:24.407616 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" containerID="cri-o://b6f1cba7d9b8571df4e1cbef3aba679da425e257dfae8755a672b28c421b9453" gracePeriod=5 Mar 12 12:35:24.427052 master-0 kubenswrapper[13984]: I0312 12:35:24.426994 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 12 12:35:24.428165 master-0 kubenswrapper[13984]: I0312 12:35:24.428130 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 12 12:35:24.485453 master-0 kubenswrapper[13984]: I0312 12:35:24.485317 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 12 12:35:24.503575 master-0 kubenswrapper[13984]: I0312 12:35:24.503522 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 12 
12:35:24.512235 master-0 kubenswrapper[13984]: I0312 12:35:24.512191 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 12 12:35:24.512806 master-0 kubenswrapper[13984]: I0312 12:35:24.512773 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 12 12:35:24.553784 master-0 kubenswrapper[13984]: I0312 12:35:24.553724 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 12 12:35:24.593169 master-0 kubenswrapper[13984]: I0312 12:35:24.593091 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 12 12:35:24.614697 master-0 kubenswrapper[13984]: I0312 12:35:24.614613 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-brf42" Mar 12 12:35:24.635079 master-0 kubenswrapper[13984]: I0312 12:35:24.635021 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 12 12:35:24.657806 master-0 kubenswrapper[13984]: I0312 12:35:24.657765 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 12 12:35:24.659568 master-0 kubenswrapper[13984]: I0312 12:35:24.659466 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 12 12:35:24.672864 master-0 kubenswrapper[13984]: I0312 12:35:24.672820 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 12 12:35:24.694223 master-0 kubenswrapper[13984]: I0312 12:35:24.694191 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 12 12:35:24.783648 master-0 
kubenswrapper[13984]: I0312 12:35:24.783535 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 12 12:35:24.789669 master-0 kubenswrapper[13984]: I0312 12:35:24.789646 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 12 12:35:24.809332 master-0 kubenswrapper[13984]: I0312 12:35:24.809289 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 12 12:35:24.829804 master-0 kubenswrapper[13984]: I0312 12:35:24.829772 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 12 12:35:24.845551 master-0 kubenswrapper[13984]: I0312 12:35:24.845515 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 12 12:35:24.892116 master-0 kubenswrapper[13984]: I0312 12:35:24.892053 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 12 12:35:24.947819 master-0 kubenswrapper[13984]: I0312 12:35:24.947770 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 12 12:35:24.949504 master-0 kubenswrapper[13984]: I0312 12:35:24.949418 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 12 12:35:25.045560 master-0 kubenswrapper[13984]: I0312 12:35:25.045335 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 12 12:35:25.090759 master-0 kubenswrapper[13984]: I0312 12:35:25.090707 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 12 12:35:25.115659 master-0 kubenswrapper[13984]: I0312 12:35:25.115599 13984 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 12 12:35:25.122869 master-0 kubenswrapper[13984]: I0312 12:35:25.122817 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 12 12:35:25.138397 master-0 kubenswrapper[13984]: I0312 12:35:25.138304 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 12 12:35:25.161659 master-0 kubenswrapper[13984]: I0312 12:35:25.161588 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 12 12:35:25.185433 master-0 kubenswrapper[13984]: I0312 12:35:25.185356 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 12 12:35:25.282959 master-0 kubenswrapper[13984]: I0312 12:35:25.282902 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 12 12:35:25.289417 master-0 kubenswrapper[13984]: I0312 12:35:25.289365 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-37hhdc2hpri03" Mar 12 12:35:25.307373 master-0 kubenswrapper[13984]: I0312 12:35:25.307181 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 12 12:35:25.319624 master-0 kubenswrapper[13984]: I0312 12:35:25.319576 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 12 12:35:25.320111 master-0 kubenswrapper[13984]: I0312 12:35:25.320084 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 12 
12:35:25.322792 master-0 kubenswrapper[13984]: I0312 12:35:25.322751 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 12 12:35:25.338243 master-0 kubenswrapper[13984]: I0312 12:35:25.338185 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 12 12:35:25.341185 master-0 kubenswrapper[13984]: I0312 12:35:25.341157 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-8vnkv64t8t82h" Mar 12 12:35:25.370056 master-0 kubenswrapper[13984]: I0312 12:35:25.370013 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 12 12:35:25.378811 master-0 kubenswrapper[13984]: I0312 12:35:25.378744 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 12 12:35:25.421053 master-0 kubenswrapper[13984]: I0312 12:35:25.419393 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 12 12:35:25.530703 master-0 kubenswrapper[13984]: I0312 12:35:25.530643 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 12 12:35:25.538906 master-0 kubenswrapper[13984]: I0312 12:35:25.538862 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 12 12:35:25.550866 master-0 kubenswrapper[13984]: I0312 12:35:25.550840 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 12 12:35:25.556365 master-0 kubenswrapper[13984]: I0312 12:35:25.556331 13984 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"signing-cabundle" Mar 12 12:35:25.565404 master-0 kubenswrapper[13984]: I0312 12:35:25.565329 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-7skrh" Mar 12 12:35:25.589675 master-0 kubenswrapper[13984]: I0312 12:35:25.589634 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-dsw4r" Mar 12 12:35:25.616041 master-0 kubenswrapper[13984]: I0312 12:35:25.616012 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 12 12:35:25.620360 master-0 kubenswrapper[13984]: I0312 12:35:25.620314 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 12 12:35:25.687685 master-0 kubenswrapper[13984]: I0312 12:35:25.687635 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 12 12:35:25.699864 master-0 kubenswrapper[13984]: I0312 12:35:25.699816 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 12 12:35:25.702747 master-0 kubenswrapper[13984]: I0312 12:35:25.702723 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 12 12:35:25.718089 master-0 kubenswrapper[13984]: I0312 12:35:25.718061 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 12 12:35:25.724267 master-0 kubenswrapper[13984]: I0312 12:35:25.724241 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 12 12:35:25.733111 master-0 kubenswrapper[13984]: I0312 12:35:25.733091 13984 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 12 12:35:25.747544 master-0 kubenswrapper[13984]: I0312 12:35:25.747512 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 12 12:35:25.809686 master-0 kubenswrapper[13984]: I0312 12:35:25.809523 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-srrbs" Mar 12 12:35:25.828205 master-0 kubenswrapper[13984]: I0312 12:35:25.828071 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 12 12:35:25.847364 master-0 kubenswrapper[13984]: I0312 12:35:25.847320 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 12 12:35:25.862741 master-0 kubenswrapper[13984]: I0312 12:35:25.862657 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 12 12:35:25.874189 master-0 kubenswrapper[13984]: I0312 12:35:25.874127 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 12 12:35:25.900773 master-0 kubenswrapper[13984]: I0312 12:35:25.900732 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 12 12:35:25.905651 master-0 kubenswrapper[13984]: I0312 12:35:25.905615 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 12 12:35:25.946174 master-0 kubenswrapper[13984]: I0312 12:35:25.946092 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 12 12:35:26.095751 master-0 kubenswrapper[13984]: I0312 12:35:26.095628 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 12 
12:35:26.103331 master-0 kubenswrapper[13984]: I0312 12:35:26.102471 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 12 12:35:26.171278 master-0 kubenswrapper[13984]: I0312 12:35:26.171243 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-lrwcw" Mar 12 12:35:26.172515 master-0 kubenswrapper[13984]: I0312 12:35:26.172496 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-pbjx7" Mar 12 12:35:26.182511 master-0 kubenswrapper[13984]: I0312 12:35:26.182436 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-fc6g9" Mar 12 12:35:26.183280 master-0 kubenswrapper[13984]: I0312 12:35:26.183247 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 12 12:35:26.234911 master-0 kubenswrapper[13984]: I0312 12:35:26.234768 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 12 12:35:26.255779 master-0 kubenswrapper[13984]: I0312 12:35:26.255726 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 12 12:35:26.270195 master-0 kubenswrapper[13984]: I0312 12:35:26.270147 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 12 12:35:26.275442 master-0 kubenswrapper[13984]: I0312 12:35:26.275227 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-drmbk" Mar 12 12:35:26.323359 master-0 kubenswrapper[13984]: I0312 12:35:26.323312 13984 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 12 12:35:26.334557 master-0 kubenswrapper[13984]: I0312 12:35:26.334495 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 12 12:35:26.362621 master-0 kubenswrapper[13984]: I0312 12:35:26.362415 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 12 12:35:26.373416 master-0 kubenswrapper[13984]: I0312 12:35:26.373342 13984 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 12 12:35:26.384144 master-0 kubenswrapper[13984]: I0312 12:35:26.384101 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 12 12:35:26.388632 master-0 kubenswrapper[13984]: I0312 12:35:26.388363 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 12 12:35:26.391863 master-0 kubenswrapper[13984]: I0312 12:35:26.391842 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 12 12:35:26.472598 master-0 kubenswrapper[13984]: I0312 12:35:26.472554 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 12 12:35:26.483296 master-0 kubenswrapper[13984]: I0312 12:35:26.483260 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 12 12:35:26.517049 master-0 kubenswrapper[13984]: I0312 12:35:26.517007 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 12 12:35:26.540444 master-0 kubenswrapper[13984]: I0312 12:35:26.540407 13984 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 12 12:35:26.568294 master-0 kubenswrapper[13984]: I0312 12:35:26.568255 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 12 12:35:26.570294 master-0 kubenswrapper[13984]: I0312 12:35:26.570274 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 12 12:35:26.621844 master-0 kubenswrapper[13984]: I0312 12:35:26.621750 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 12 12:35:26.623114 master-0 kubenswrapper[13984]: I0312 12:35:26.623069 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 12 12:35:26.623678 master-0 kubenswrapper[13984]: I0312 12:35:26.623594 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 12 12:35:26.670368 master-0 kubenswrapper[13984]: I0312 12:35:26.670306 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 12 12:35:26.706777 master-0 kubenswrapper[13984]: I0312 12:35:26.706734 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 12 12:35:26.725202 master-0 kubenswrapper[13984]: I0312 12:35:26.725145 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 12 12:35:26.730079 master-0 kubenswrapper[13984]: I0312 12:35:26.730047 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 12 12:35:26.742309 master-0 kubenswrapper[13984]: I0312 12:35:26.742256 13984 reflector.go:368] Caches populated for *v1.Service from 
k8s.io/client-go/informers/factory.go:160 Mar 12 12:35:26.744130 master-0 kubenswrapper[13984]: I0312 12:35:26.744100 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 12 12:35:26.750827 master-0 kubenswrapper[13984]: I0312 12:35:26.750799 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 12 12:35:26.867432 master-0 kubenswrapper[13984]: I0312 12:35:26.867362 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-pmfnh" Mar 12 12:35:26.872734 master-0 kubenswrapper[13984]: I0312 12:35:26.872648 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-cmdtw" Mar 12 12:35:26.874196 master-0 kubenswrapper[13984]: I0312 12:35:26.874154 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 12 12:35:26.890121 master-0 kubenswrapper[13984]: I0312 12:35:26.890103 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-6r5d5" Mar 12 12:35:26.898939 master-0 kubenswrapper[13984]: I0312 12:35:26.898897 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 12 12:35:26.913470 master-0 kubenswrapper[13984]: I0312 12:35:26.913423 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 12 12:35:26.923125 master-0 kubenswrapper[13984]: I0312 12:35:26.923076 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 12 12:35:26.962094 master-0 kubenswrapper[13984]: I0312 12:35:26.962016 13984 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 12 12:35:26.975864 master-0 kubenswrapper[13984]: I0312 12:35:26.975786 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 12 12:35:27.008448 master-0 kubenswrapper[13984]: I0312 12:35:27.008369 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 12 12:35:27.024991 master-0 kubenswrapper[13984]: I0312 12:35:27.024762 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-qwrgw" Mar 12 12:35:27.035673 master-0 kubenswrapper[13984]: I0312 12:35:27.035434 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 12 12:35:27.087085 master-0 kubenswrapper[13984]: I0312 12:35:27.087012 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dbn77" Mar 12 12:35:27.121307 master-0 kubenswrapper[13984]: I0312 12:35:27.121217 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 12 12:35:27.185119 master-0 kubenswrapper[13984]: I0312 12:35:27.184991 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 12 12:35:27.277134 master-0 kubenswrapper[13984]: I0312 12:35:27.277069 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 12 12:35:27.305234 master-0 kubenswrapper[13984]: I0312 12:35:27.305166 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 12 12:35:27.329755 master-0 kubenswrapper[13984]: I0312 
12:35:27.329697 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 12 12:35:27.400914 master-0 kubenswrapper[13984]: I0312 12:35:27.397897 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 12 12:35:27.400914 master-0 kubenswrapper[13984]: I0312 12:35:27.398399 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 12 12:35:27.474650 master-0 kubenswrapper[13984]: I0312 12:35:27.474592 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 12 12:35:27.486683 master-0 kubenswrapper[13984]: I0312 12:35:27.486217 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 12:35:27.491982 master-0 kubenswrapper[13984]: I0312 12:35:27.491923 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 12 12:35:27.499727 master-0 kubenswrapper[13984]: I0312 12:35:27.499686 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 12 12:35:27.533179 master-0 kubenswrapper[13984]: I0312 12:35:27.533133 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 12 12:35:27.533894 master-0 kubenswrapper[13984]: I0312 12:35:27.533840 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 12 12:35:27.579280 master-0 kubenswrapper[13984]: I0312 12:35:27.579207 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-5jzsx"
Mar 12 12:35:27.595818 master-0 kubenswrapper[13984]: I0312 12:35:27.595769 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 12 12:35:27.597344 master-0 kubenswrapper[13984]: I0312 12:35:27.597302 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 12 12:35:27.624463 master-0 kubenswrapper[13984]: I0312 12:35:27.624403 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79db4dd985-zqfm7"
Mar 12 12:35:27.630321 master-0 kubenswrapper[13984]: I0312 12:35:27.630270 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 12 12:35:27.632245 master-0 kubenswrapper[13984]: I0312 12:35:27.632186 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-79db4dd985-zqfm7"
Mar 12 12:35:27.675568 master-0 kubenswrapper[13984]: I0312 12:35:27.675450 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 12 12:35:27.680690 master-0 kubenswrapper[13984]: I0312 12:35:27.680633 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 12 12:35:27.684113 master-0 kubenswrapper[13984]: I0312 12:35:27.684081 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 12 12:35:27.696038 master-0 kubenswrapper[13984]: I0312 12:35:27.695994 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 12 12:35:27.711542 master-0 kubenswrapper[13984]: I0312 12:35:27.711435 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 12 12:35:27.875455 master-0 kubenswrapper[13984]: I0312 12:35:27.871754 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 12 12:35:27.928241 master-0 kubenswrapper[13984]: I0312 12:35:27.928164 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 12 12:35:28.172994 master-0 kubenswrapper[13984]: I0312 12:35:28.172884 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 12 12:35:28.293749 master-0 kubenswrapper[13984]: I0312 12:35:28.293690 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 12 12:35:28.446916 master-0 kubenswrapper[13984]: I0312 12:35:28.446855 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 12 12:35:28.477042 master-0 kubenswrapper[13984]: I0312 12:35:28.476990 13984 scope.go:117] "RemoveContainer" containerID="f0f3c0b6faeeb349351f1b371244c103cf179f81f4ae7b1577ee82387a636818"
Mar 12 12:35:28.497700 master-0 kubenswrapper[13984]: I0312 12:35:28.497653 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 12 12:35:28.497971 master-0 kubenswrapper[13984]: I0312 12:35:28.497942 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 12 12:35:28.501954 master-0 kubenswrapper[13984]: I0312 12:35:28.501926 13984 scope.go:117] "RemoveContainer" containerID="700614ee74998a2f22b9ab087ce7bf59a848008e38ccd5dd9ed6bc2ca56e75b9"
Mar 12 12:35:28.530603 master-0 kubenswrapper[13984]: I0312 12:35:28.530537 13984 scope.go:117] "RemoveContainer" containerID="a3e42264f81889f53565ef287ba1a41d67422c625a1e231bc34f310555991c94"
Mar 12 12:35:28.556097 master-0 kubenswrapper[13984]: I0312 12:35:28.556026 13984 scope.go:117] "RemoveContainer" containerID="dbc67a0df6d005a6dc51cf6a3278cefd084341f566051f53aeef112699601de2"
Mar 12 12:35:28.569973 master-0 kubenswrapper[13984]: I0312 12:35:28.569947 13984 scope.go:117] "RemoveContainer" containerID="f6fe46052f0fc88bb12a846f99f359af9fcb5cab5db347f3b65949fb3346c632"
Mar 12 12:35:28.588532 master-0 kubenswrapper[13984]: I0312 12:35:28.588488 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-6rn7c"
Mar 12 12:35:28.600984 master-0 kubenswrapper[13984]: I0312 12:35:28.600946 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 12 12:35:28.678645 master-0 kubenswrapper[13984]: I0312 12:35:28.678561 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 12 12:35:28.784113 master-0 kubenswrapper[13984]: I0312 12:35:28.783992 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:35:28.790514 master-0 kubenswrapper[13984]: I0312 12:35:28.790380 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:35:28.796777 master-0 kubenswrapper[13984]: I0312 12:35:28.796718 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 12 12:35:28.833830 master-0 kubenswrapper[13984]: I0312 12:35:28.833752 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-zkpl5"
Mar 12 12:35:28.835919 master-0 kubenswrapper[13984]: I0312 12:35:28.835742 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 12 12:35:28.949000 master-0 kubenswrapper[13984]: I0312 12:35:28.948946 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 12 12:35:28.973709 master-0 kubenswrapper[13984]: I0312 12:35:28.973495 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 12 12:35:29.319280 master-0 kubenswrapper[13984]: I0312 12:35:29.319211 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-7vmf8"
Mar 12 12:35:29.366757 master-0 kubenswrapper[13984]: I0312 12:35:29.366652 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 12 12:35:29.656622 master-0 kubenswrapper[13984]: I0312 12:35:29.656453 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 12 12:35:29.836797 master-0 kubenswrapper[13984]: I0312 12:35:29.836680 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 12 12:35:30.322641 master-0 kubenswrapper[13984]: I0312 12:35:30.322591 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log"
Mar 12 12:35:30.322641 master-0 kubenswrapper[13984]: I0312 12:35:30.322672 13984 generic.go:334] "Generic (PLEG): container finished" podID="899242a15b2bdf3b4a04fb323647ca94" containerID="b6f1cba7d9b8571df4e1cbef3aba679da425e257dfae8755a672b28c421b9453" exitCode=137
Mar 12 12:35:31.045591 master-0 kubenswrapper[13984]: I0312 12:35:31.045527 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log"
Mar 12 12:35:31.045785 master-0 kubenswrapper[13984]: I0312 12:35:31.045626 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:35:31.151460 master-0 kubenswrapper[13984]: I0312 12:35:31.151410 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 12 12:35:31.151460 master-0 kubenswrapper[13984]: I0312 12:35:31.151474 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 12 12:35:31.151460 master-0 kubenswrapper[13984]: I0312 12:35:31.151529 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 12 12:35:31.151460 master-0 kubenswrapper[13984]: I0312 12:35:31.151612 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 12 12:35:31.151460 master-0 kubenswrapper[13984]: I0312 12:35:31.151655 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 12 12:35:31.153095 master-0 kubenswrapper[13984]: I0312 12:35:31.153057 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log" (OuterVolumeSpecName: "var-log") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:35:31.153222 master-0 kubenswrapper[13984]: I0312 12:35:31.153111 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock" (OuterVolumeSpecName: "var-lock") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:35:31.153222 master-0 kubenswrapper[13984]: I0312 12:35:31.153152 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:35:31.153222 master-0 kubenswrapper[13984]: I0312 12:35:31.153189 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests" (OuterVolumeSpecName: "manifests") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:35:31.159098 master-0 kubenswrapper[13984]: I0312 12:35:31.159060 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:35:31.254141 master-0 kubenswrapper[13984]: I0312 12:35:31.253974 13984 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:31.254141 master-0 kubenswrapper[13984]: I0312 12:35:31.254047 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:31.254141 master-0 kubenswrapper[13984]: I0312 12:35:31.254057 13984 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:31.254141 master-0 kubenswrapper[13984]: I0312 12:35:31.254067 13984 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:31.254141 master-0 kubenswrapper[13984]: I0312 12:35:31.254076 13984 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:31.334151 master-0 kubenswrapper[13984]: I0312 12:35:31.334097 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log"
Mar 12 12:35:31.334753 master-0 kubenswrapper[13984]: I0312 12:35:31.334167 13984 scope.go:117] "RemoveContainer" containerID="b6f1cba7d9b8571df4e1cbef3aba679da425e257dfae8755a672b28c421b9453"
Mar 12 12:35:31.334753 master-0 kubenswrapper[13984]: I0312 12:35:31.334264 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 12 12:35:31.990776 master-0 kubenswrapper[13984]: I0312 12:35:31.990681 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899242a15b2bdf3b4a04fb323647ca94" path="/var/lib/kubelet/pods/899242a15b2bdf3b4a04fb323647ca94/volumes"
Mar 12 12:35:34.111411 master-0 kubenswrapper[13984]: I0312 12:35:34.111352 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6686f9695f-gkbkr"]
Mar 12 12:35:34.155824 master-0 kubenswrapper[13984]: I0312 12:35:34.155773 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 12 12:35:34.442798 master-0 kubenswrapper[13984]: I0312 12:35:34.442739 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 12 12:35:35.680886 master-0 kubenswrapper[13984]: I0312 12:35:35.680825 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 12 12:35:35.813259 master-0 kubenswrapper[13984]: I0312 12:35:35.813202 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 12 12:35:36.239886 master-0 kubenswrapper[13984]: I0312 12:35:36.239829 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-fszq7"
Mar 12 12:35:38.204951 master-0 kubenswrapper[13984]: I0312 12:35:38.204907 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 12 12:35:38.241718 master-0 kubenswrapper[13984]: I0312 12:35:38.241668 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 12 12:35:38.724113 master-0 kubenswrapper[13984]: I0312 12:35:38.724065 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 12 12:35:38.756728 master-0 kubenswrapper[13984]: I0312 12:35:38.756680 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rvcb2"
Mar 12 12:35:38.836688 master-0 kubenswrapper[13984]: I0312 12:35:38.836630 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-zrwsh"
Mar 12 12:35:38.967192 master-0 kubenswrapper[13984]: I0312 12:35:38.967149 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 12 12:35:39.205560 master-0 kubenswrapper[13984]: I0312 12:35:39.205458 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 12 12:35:39.209770 master-0 kubenswrapper[13984]: I0312 12:35:39.209726 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 12 12:35:40.428216 master-0 kubenswrapper[13984]: I0312 12:35:40.428158 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 12 12:35:40.602452 master-0 kubenswrapper[13984]: I0312 12:35:40.602391 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 12 12:35:40.898654 master-0 kubenswrapper[13984]: I0312 12:35:40.898610 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 12 12:35:41.114463 master-0 kubenswrapper[13984]: I0312 12:35:41.114399 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 12 12:35:41.683806 master-0 kubenswrapper[13984]: I0312 12:35:41.683686 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 12 12:35:41.689042 master-0 kubenswrapper[13984]: I0312 12:35:41.688995 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 12 12:35:42.113169 master-0 kubenswrapper[13984]: I0312 12:35:42.113129 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 12 12:35:42.139167 master-0 kubenswrapper[13984]: I0312 12:35:42.139040 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 12 12:35:42.838147 master-0 kubenswrapper[13984]: I0312 12:35:42.838076 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 12 12:35:44.195736 master-0 kubenswrapper[13984]: I0312 12:35:44.195681 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 12 12:35:59.144682 master-0 kubenswrapper[13984]: I0312 12:35:59.144608 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6686f9695f-gkbkr" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console" containerID="cri-o://bcd65b1e4c21449072ed272f87ee781da0da9fc877eb7847799dae3f064bf2e7" gracePeriod=15
Mar 12 12:35:59.552156 master-0 kubenswrapper[13984]: I0312 12:35:59.552106 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6686f9695f-gkbkr_de8d0297-e472-45a4-9658-1d893e4c34aa/console/0.log"
Mar 12 12:35:59.552308 master-0 kubenswrapper[13984]: I0312 12:35:59.552165 13984 generic.go:334] "Generic (PLEG): container finished" podID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerID="bcd65b1e4c21449072ed272f87ee781da0da9fc877eb7847799dae3f064bf2e7" exitCode=2
Mar 12 12:35:59.552308 master-0 kubenswrapper[13984]: I0312 12:35:59.552199 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6686f9695f-gkbkr" event={"ID":"de8d0297-e472-45a4-9658-1d893e4c34aa","Type":"ContainerDied","Data":"bcd65b1e4c21449072ed272f87ee781da0da9fc877eb7847799dae3f064bf2e7"}
Mar 12 12:35:59.552308 master-0 kubenswrapper[13984]: I0312 12:35:59.552225 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6686f9695f-gkbkr" event={"ID":"de8d0297-e472-45a4-9658-1d893e4c34aa","Type":"ContainerDied","Data":"57dfe76efbf7deb3a53d129eabb1be70578bcf3fbe93a2dd467072510efe925e"}
Mar 12 12:35:59.552308 master-0 kubenswrapper[13984]: I0312 12:35:59.552236 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57dfe76efbf7deb3a53d129eabb1be70578bcf3fbe93a2dd467072510efe925e"
Mar 12 12:35:59.570138 master-0 kubenswrapper[13984]: I0312 12:35:59.570078 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6686f9695f-gkbkr_de8d0297-e472-45a4-9658-1d893e4c34aa/console/0.log"
Mar 12 12:35:59.570267 master-0 kubenswrapper[13984]: I0312 12:35:59.570164 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:35:59.733214 master-0 kubenswrapper[13984]: I0312 12:35:59.733147 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-service-ca\") pod \"de8d0297-e472-45a4-9658-1d893e4c34aa\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") "
Mar 12 12:35:59.733439 master-0 kubenswrapper[13984]: I0312 12:35:59.733284 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-oauth-serving-cert\") pod \"de8d0297-e472-45a4-9658-1d893e4c34aa\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") "
Mar 12 12:35:59.733439 master-0 kubenswrapper[13984]: I0312 12:35:59.733317 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-serving-cert\") pod \"de8d0297-e472-45a4-9658-1d893e4c34aa\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") "
Mar 12 12:35:59.733439 master-0 kubenswrapper[13984]: I0312 12:35:59.733345 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqczz\" (UniqueName: \"kubernetes.io/projected/de8d0297-e472-45a4-9658-1d893e4c34aa-kube-api-access-mqczz\") pod \"de8d0297-e472-45a4-9658-1d893e4c34aa\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") "
Mar 12 12:35:59.733844 master-0 kubenswrapper[13984]: I0312 12:35:59.733785 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-service-ca" (OuterVolumeSpecName: "service-ca") pod "de8d0297-e472-45a4-9658-1d893e4c34aa" (UID: "de8d0297-e472-45a4-9658-1d893e4c34aa"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:35:59.733914 master-0 kubenswrapper[13984]: I0312 12:35:59.733844 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "de8d0297-e472-45a4-9658-1d893e4c34aa" (UID: "de8d0297-e472-45a4-9658-1d893e4c34aa"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.733993 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-trusted-ca-bundle\") pod \"de8d0297-e472-45a4-9658-1d893e4c34aa\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") "
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.734115 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-oauth-config\") pod \"de8d0297-e472-45a4-9658-1d893e4c34aa\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") "
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.734166 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-console-config\") pod \"de8d0297-e472-45a4-9658-1d893e4c34aa\" (UID: \"de8d0297-e472-45a4-9658-1d893e4c34aa\") "
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.734276 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "de8d0297-e472-45a4-9658-1d893e4c34aa" (UID: "de8d0297-e472-45a4-9658-1d893e4c34aa"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.734622 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-console-config" (OuterVolumeSpecName: "console-config") pod "de8d0297-e472-45a4-9658-1d893e4c34aa" (UID: "de8d0297-e472-45a4-9658-1d893e4c34aa"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.734659 13984 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-oauth-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.734689 13984 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:59.735796 master-0 kubenswrapper[13984]: I0312 12:35:59.734706 13984 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:59.737977 master-0 kubenswrapper[13984]: I0312 12:35:59.737924 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "de8d0297-e472-45a4-9658-1d893e4c34aa" (UID: "de8d0297-e472-45a4-9658-1d893e4c34aa"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:35:59.738137 master-0 kubenswrapper[13984]: I0312 12:35:59.738084 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8d0297-e472-45a4-9658-1d893e4c34aa-kube-api-access-mqczz" (OuterVolumeSpecName: "kube-api-access-mqczz") pod "de8d0297-e472-45a4-9658-1d893e4c34aa" (UID: "de8d0297-e472-45a4-9658-1d893e4c34aa"). InnerVolumeSpecName "kube-api-access-mqczz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:35:59.738405 master-0 kubenswrapper[13984]: I0312 12:35:59.738375 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "de8d0297-e472-45a4-9658-1d893e4c34aa" (UID: "de8d0297-e472-45a4-9658-1d893e4c34aa"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:35:59.836146 master-0 kubenswrapper[13984]: I0312 12:35:59.835998 13984 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:59.836146 master-0 kubenswrapper[13984]: I0312 12:35:59.836035 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqczz\" (UniqueName: \"kubernetes.io/projected/de8d0297-e472-45a4-9658-1d893e4c34aa-kube-api-access-mqczz\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:59.836146 master-0 kubenswrapper[13984]: I0312 12:35:59.836045 13984 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d0297-e472-45a4-9658-1d893e4c34aa-console-oauth-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:59.836146 master-0 kubenswrapper[13984]: I0312 12:35:59.836064 13984 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d0297-e472-45a4-9658-1d893e4c34aa-console-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:35:59.836654 master-0 kubenswrapper[13984]: I0312 12:35:59.836619 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"]
Mar 12 12:35:59.836907 master-0 kubenswrapper[13984]: E0312 12:35:59.836871 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor"
Mar 12 12:35:59.836907 master-0 kubenswrapper[13984]: I0312 12:35:59.836890 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor"
Mar 12 12:35:59.836907 master-0 kubenswrapper[13984]: E0312 12:35:59.836905 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console"
Mar 12 12:35:59.837103 master-0 kubenswrapper[13984]: I0312 12:35:59.836913 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console"
Mar 12 12:35:59.837103 master-0 kubenswrapper[13984]: E0312 12:35:59.836941 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" containerName="installer"
Mar 12 12:35:59.837103 master-0 kubenswrapper[13984]: I0312 12:35:59.836950 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" containerName="installer"
Mar 12 12:35:59.837283 master-0 kubenswrapper[13984]: I0312 12:35:59.837113 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a40de76a-de63-4e37-b1ab-b7fe767e67ea" containerName="installer"
Mar 12 12:35:59.837283 master-0 kubenswrapper[13984]: I0312 12:35:59.837128 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" containerName="console"
Mar 12 12:35:59.837283 master-0 kubenswrapper[13984]: I0312 12:35:59.837140 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor"
Mar 12 12:35:59.837746 master-0 kubenswrapper[13984]: I0312 12:35:59.837708 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:35:59.839918 master-0 kubenswrapper[13984]: I0312 12:35:59.839891 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 12 12:35:59.840091 master-0 kubenswrapper[13984]: I0312 12:35:59.840067 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 12 12:35:59.848392 master-0 kubenswrapper[13984]: I0312 12:35:59.848338 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"]
Mar 12 12:35:59.939687 master-0 kubenswrapper[13984]: I0312 12:35:59.937972 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b960b7c6-d2d2-431f-9ab6-3a5a228b9189-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-4tstb\" (UID: \"b960b7c6-d2d2-431f-9ab6-3a5a228b9189\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:35:59.939687 master-0 kubenswrapper[13984]: I0312 12:35:59.938088 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b960b7c6-d2d2-431f-9ab6-3a5a228b9189-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-4tstb\" (UID: \"b960b7c6-d2d2-431f-9ab6-3a5a228b9189\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:36:00.039434 master-0 kubenswrapper[13984]: I0312 12:36:00.039326 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b960b7c6-d2d2-431f-9ab6-3a5a228b9189-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-4tstb\" (UID: \"b960b7c6-d2d2-431f-9ab6-3a5a228b9189\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:36:00.039434 master-0 kubenswrapper[13984]: I0312 12:36:00.039416 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b960b7c6-d2d2-431f-9ab6-3a5a228b9189-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-4tstb\" (UID: \"b960b7c6-d2d2-431f-9ab6-3a5a228b9189\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:36:00.040517 master-0 kubenswrapper[13984]: I0312 12:36:00.040450 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/b960b7c6-d2d2-431f-9ab6-3a5a228b9189-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-4tstb\" (UID: \"b960b7c6-d2d2-431f-9ab6-3a5a228b9189\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:36:00.042649 master-0 kubenswrapper[13984]: I0312 12:36:00.042606 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/b960b7c6-d2d2-431f-9ab6-3a5a228b9189-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-4tstb\" (UID: \"b960b7c6-d2d2-431f-9ab6-3a5a228b9189\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:36:00.171274 master-0 kubenswrapper[13984]: I0312 12:36:00.171110 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"
Mar 12 12:36:00.560777 master-0 kubenswrapper[13984]: I0312 12:36:00.560676 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6686f9695f-gkbkr"
Mar 12 12:36:00.575224 master-0 kubenswrapper[13984]: I0312 12:36:00.574886 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-4tstb"]
Mar 12 12:36:00.582324 master-0 kubenswrapper[13984]: W0312 12:36:00.582282 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb960b7c6_d2d2_431f_9ab6_3a5a228b9189.slice/crio-5a0bf26af7f81800036b1f1f4ca1c1cecae61b5653f0f770e4ecc4ef6df76e22 WatchSource:0}: Error finding container 5a0bf26af7f81800036b1f1f4ca1c1cecae61b5653f0f770e4ecc4ef6df76e22: Status 404 returned error can't find the container with id 5a0bf26af7f81800036b1f1f4ca1c1cecae61b5653f0f770e4ecc4ef6df76e22
Mar 12 12:36:00.591695 master-0 kubenswrapper[13984]: I0312 12:36:00.591643 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6686f9695f-gkbkr"]
Mar 12 12:36:00.597487 master-0 kubenswrapper[13984]: I0312 12:36:00.597439 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6686f9695f-gkbkr"]
Mar 12 12:36:01.577044 master-0 kubenswrapper[13984]: I0312 12:36:01.576981 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb" event={"ID":"b960b7c6-d2d2-431f-9ab6-3a5a228b9189","Type":"ContainerStarted","Data":"5a0bf26af7f81800036b1f1f4ca1c1cecae61b5653f0f770e4ecc4ef6df76e22"}
Mar 12 12:36:01.988699 master-0 kubenswrapper[13984]: I0312 12:36:01.988638 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="de8d0297-e472-45a4-9658-1d893e4c34aa" path="/var/lib/kubelet/pods/de8d0297-e472-45a4-9658-1d893e4c34aa/volumes" Mar 12 12:36:02.584718 master-0 kubenswrapper[13984]: I0312 12:36:02.584630 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb" event={"ID":"b960b7c6-d2d2-431f-9ab6-3a5a228b9189","Type":"ContainerStarted","Data":"d8a63324794459851013a9a7038ab8d03ed0d2fd4c6c4e0a7a120633eaccc78f"} Mar 12 12:36:02.603158 master-0 kubenswrapper[13984]: I0312 12:36:02.603079 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cbd49d755-4tstb" podStartSLOduration=2.295118549 podStartE2EDuration="3.603061622s" podCreationTimestamp="2026-03-12 12:35:59 +0000 UTC" firstStartedPulling="2026-03-12 12:36:00.587931563 +0000 UTC m=+692.785947055" lastFinishedPulling="2026-03-12 12:36:01.895874626 +0000 UTC m=+694.093890128" observedRunningTime="2026-03-12 12:36:02.599158672 +0000 UTC m=+694.797174184" watchObservedRunningTime="2026-03-12 12:36:02.603061622 +0000 UTC m=+694.801077114" Mar 12 12:36:26.088578 master-0 kubenswrapper[13984]: E0312 12:36:26.088528 13984 configmap.go:193] Couldn't get configMap openshift-monitoring/prometheus-k8s-rulefiles-0: configmap "prometheus-k8s-rulefiles-0" not found Mar 12 12:36:26.089325 master-0 kubenswrapper[13984]: E0312 12:36:26.089306 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-rulefiles-0 podName:e5e27787-de34-48dd-8854-79387c59fa6c nodeName:}" failed. No retries permitted until 2026-03-12 12:36:26.589286514 +0000 UTC m=+718.787302006 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-k8s-rulefiles-0" (UniqueName: "kubernetes.io/configmap/e5e27787-de34-48dd-8854-79387c59fa6c-prometheus-k8s-rulefiles-0") pod "prometheus-k8s-0" (UID: "e5e27787-de34-48dd-8854-79387c59fa6c") : configmap "prometheus-k8s-rulefiles-0" not found Mar 12 12:37:15.235850 master-0 kubenswrapper[13984]: I0312 12:37:15.235783 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7f76bf8d9-qghrl"] Mar 12 12:37:15.236862 master-0 kubenswrapper[13984]: I0312 12:37:15.236823 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.262993 master-0 kubenswrapper[13984]: I0312 12:37:15.262900 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f76bf8d9-qghrl"] Mar 12 12:37:15.278503 master-0 kubenswrapper[13984]: I0312 12:37:15.278436 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-oauth-config\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.278776 master-0 kubenswrapper[13984]: I0312 12:37:15.278535 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-service-ca\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.278776 master-0 kubenswrapper[13984]: I0312 12:37:15.278615 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-trusted-ca-bundle\") pod 
\"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.278776 master-0 kubenswrapper[13984]: I0312 12:37:15.278652 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-serving-cert\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.278776 master-0 kubenswrapper[13984]: I0312 12:37:15.278683 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvj8h\" (UniqueName: \"kubernetes.io/projected/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-kube-api-access-kvj8h\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.278986 master-0 kubenswrapper[13984]: I0312 12:37:15.278809 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-oauth-serving-cert\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.278986 master-0 kubenswrapper[13984]: I0312 12:37:15.278827 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-config\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.380955 master-0 kubenswrapper[13984]: I0312 12:37:15.380867 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-oauth-serving-cert\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.380955 master-0 kubenswrapper[13984]: I0312 12:37:15.380946 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-config\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.381208 master-0 kubenswrapper[13984]: I0312 12:37:15.380987 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-oauth-config\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.381208 master-0 kubenswrapper[13984]: I0312 12:37:15.381009 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-service-ca\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.381208 master-0 kubenswrapper[13984]: I0312 12:37:15.381039 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-trusted-ca-bundle\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.381208 master-0 kubenswrapper[13984]: I0312 12:37:15.381062 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-serving-cert\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.381208 master-0 kubenswrapper[13984]: I0312 12:37:15.381088 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvj8h\" (UniqueName: \"kubernetes.io/projected/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-kube-api-access-kvj8h\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.382189 master-0 kubenswrapper[13984]: I0312 12:37:15.382154 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-oauth-serving-cert\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.382772 master-0 kubenswrapper[13984]: I0312 12:37:15.382739 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-config\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.383850 master-0 kubenswrapper[13984]: I0312 12:37:15.383825 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-service-ca\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.384289 master-0 kubenswrapper[13984]: I0312 12:37:15.384260 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-trusted-ca-bundle\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.385931 master-0 kubenswrapper[13984]: I0312 12:37:15.385897 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-oauth-config\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.386413 master-0 kubenswrapper[13984]: I0312 12:37:15.386373 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-serving-cert\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.395141 master-0 kubenswrapper[13984]: I0312 12:37:15.395105 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvj8h\" (UniqueName: \"kubernetes.io/projected/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-kube-api-access-kvj8h\") pod \"console-7f76bf8d9-qghrl\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.560741 master-0 kubenswrapper[13984]: I0312 12:37:15.560597 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:15.946115 master-0 kubenswrapper[13984]: I0312 12:37:15.946038 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7f76bf8d9-qghrl"] Mar 12 12:37:16.152040 master-0 kubenswrapper[13984]: I0312 12:37:16.151975 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76bf8d9-qghrl" event={"ID":"5cfd45cf-2239-49a1-887b-e1115fbc5fe3","Type":"ContainerStarted","Data":"6b66236ced309d9d764e23b8349415b0dfc74f95cf6030e11271a84a28a86fb1"} Mar 12 12:37:16.152040 master-0 kubenswrapper[13984]: I0312 12:37:16.152024 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76bf8d9-qghrl" event={"ID":"5cfd45cf-2239-49a1-887b-e1115fbc5fe3","Type":"ContainerStarted","Data":"f742dbc9ecc16f1b8a5bab49d77045dd6343b71a802a443cc30146d026e4e431"} Mar 12 12:37:16.171032 master-0 kubenswrapper[13984]: I0312 12:37:16.170966 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7f76bf8d9-qghrl" podStartSLOduration=1.170948458 podStartE2EDuration="1.170948458s" podCreationTimestamp="2026-03-12 12:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:37:16.167341126 +0000 UTC m=+768.365356618" watchObservedRunningTime="2026-03-12 12:37:16.170948458 +0000 UTC m=+768.368963950" Mar 12 12:37:24.926764 master-0 kubenswrapper[13984]: I0312 12:37:24.926632 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-p6ggm"] Mar 12 12:37:24.928308 master-0 kubenswrapper[13984]: I0312 12:37:24.928268 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:24.931503 master-0 kubenswrapper[13984]: I0312 12:37:24.931404 13984 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 12 12:37:24.931989 master-0 kubenswrapper[13984]: I0312 12:37:24.931951 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 12 12:37:24.936601 master-0 kubenswrapper[13984]: I0312 12:37:24.936566 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 12 12:37:24.937044 master-0 kubenswrapper[13984]: I0312 12:37:24.937020 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 12 12:37:24.952016 master-0 kubenswrapper[13984]: I0312 12:37:24.951925 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-p6ggm"] Mar 12 12:37:25.035639 master-0 kubenswrapper[13984]: I0312 12:37:25.035555 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7p8qh\" (UniqueName: \"kubernetes.io/projected/09acc31e-703b-4124-a43d-ee8d2ff94d96-kube-api-access-7p8qh\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.035921 master-0 kubenswrapper[13984]: I0312 12:37:25.035800 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/09acc31e-703b-4124-a43d-ee8d2ff94d96-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.035921 master-0 kubenswrapper[13984]: I0312 12:37:25.035910 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/09acc31e-703b-4124-a43d-ee8d2ff94d96-os-client-config\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.137825 master-0 kubenswrapper[13984]: I0312 12:37:25.137778 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7p8qh\" (UniqueName: \"kubernetes.io/projected/09acc31e-703b-4124-a43d-ee8d2ff94d96-kube-api-access-7p8qh\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.138214 master-0 kubenswrapper[13984]: I0312 12:37:25.138190 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/09acc31e-703b-4124-a43d-ee8d2ff94d96-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.138349 master-0 kubenswrapper[13984]: I0312 12:37:25.138330 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/09acc31e-703b-4124-a43d-ee8d2ff94d96-os-client-config\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.139255 master-0 kubenswrapper[13984]: I0312 12:37:25.139222 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/09acc31e-703b-4124-a43d-ee8d2ff94d96-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " 
pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.142252 master-0 kubenswrapper[13984]: I0312 12:37:25.142215 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/09acc31e-703b-4124-a43d-ee8d2ff94d96-os-client-config\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.153717 master-0 kubenswrapper[13984]: I0312 12:37:25.153677 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7p8qh\" (UniqueName: \"kubernetes.io/projected/09acc31e-703b-4124-a43d-ee8d2ff94d96-kube-api-access-7p8qh\") pod \"sushy-emulator-78f6d7d749-p6ggm\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.266816 master-0 kubenswrapper[13984]: I0312 12:37:25.266750 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:25.562244 master-0 kubenswrapper[13984]: I0312 12:37:25.562065 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:25.562244 master-0 kubenswrapper[13984]: I0312 12:37:25.562174 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:25.566789 master-0 kubenswrapper[13984]: I0312 12:37:25.566738 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:25.714395 master-0 kubenswrapper[13984]: I0312 12:37:25.714332 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-p6ggm"] Mar 12 12:37:26.395285 master-0 kubenswrapper[13984]: I0312 12:37:26.395210 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" event={"ID":"09acc31e-703b-4124-a43d-ee8d2ff94d96","Type":"ContainerStarted","Data":"29ca8b52a193b372234a4b1e4080ac30bd76569ed3c77f5736a667a923ef5d6b"} Mar 12 12:37:26.398178 master-0 kubenswrapper[13984]: I0312 12:37:26.398131 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:37:27.236419 master-0 kubenswrapper[13984]: I0312 12:37:27.236352 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79db4dd985-zqfm7"] Mar 12 12:37:35.467705 master-0 kubenswrapper[13984]: I0312 12:37:35.467582 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" event={"ID":"09acc31e-703b-4124-a43d-ee8d2ff94d96","Type":"ContainerStarted","Data":"c70de9c4e88d3c97ad7ad479acb9a26b173cf5b25c3c9f3a396e702a946cd740"} Mar 12 12:37:35.494422 master-0 kubenswrapper[13984]: I0312 12:37:35.494260 13984 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" podStartSLOduration=2.015752389 podStartE2EDuration="11.494228589s" podCreationTimestamp="2026-03-12 12:37:24 +0000 UTC" firstStartedPulling="2026-03-12 12:37:25.716849421 +0000 UTC m=+777.914864913" lastFinishedPulling="2026-03-12 12:37:35.195325621 +0000 UTC m=+787.393341113" observedRunningTime="2026-03-12 12:37:35.4893252 +0000 UTC m=+787.687340712" watchObservedRunningTime="2026-03-12 12:37:35.494228589 +0000 UTC m=+787.692244081" Mar 12 12:37:45.267249 master-0 kubenswrapper[13984]: I0312 12:37:45.267166 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:45.267249 master-0 kubenswrapper[13984]: I0312 12:37:45.267240 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:45.277677 master-0 kubenswrapper[13984]: I0312 12:37:45.277591 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:45.568863 master-0 kubenswrapper[13984]: I0312 12:37:45.568746 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:37:52.297938 master-0 kubenswrapper[13984]: I0312 12:37:52.297828 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-79db4dd985-zqfm7" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" containerID="cri-o://4cad2d72886f2435dbe577ee337f88f9af220f6e90117513eb2892126e93c01f" gracePeriod=15 Mar 12 12:37:52.626416 master-0 kubenswrapper[13984]: I0312 12:37:52.626362 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79db4dd985-zqfm7_0485a15a-a4a4-4df7-a74c-d7379f4d01cb/console/0.log" Mar 12 
12:37:52.626416 master-0 kubenswrapper[13984]: I0312 12:37:52.626414 13984 generic.go:334] "Generic (PLEG): container finished" podID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerID="4cad2d72886f2435dbe577ee337f88f9af220f6e90117513eb2892126e93c01f" exitCode=2 Mar 12 12:37:52.626676 master-0 kubenswrapper[13984]: I0312 12:37:52.626446 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79db4dd985-zqfm7" event={"ID":"0485a15a-a4a4-4df7-a74c-d7379f4d01cb","Type":"ContainerDied","Data":"4cad2d72886f2435dbe577ee337f88f9af220f6e90117513eb2892126e93c01f"} Mar 12 12:37:52.779457 master-0 kubenswrapper[13984]: I0312 12:37:52.779401 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79db4dd985-zqfm7_0485a15a-a4a4-4df7-a74c-d7379f4d01cb/console/0.log" Mar 12 12:37:52.779457 master-0 kubenswrapper[13984]: I0312 12:37:52.779470 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:37:52.809328 master-0 kubenswrapper[13984]: I0312 12:37:52.809224 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-serving-cert\") pod \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " Mar 12 12:37:52.809328 master-0 kubenswrapper[13984]: I0312 12:37:52.809313 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-oauth-serving-cert\") pod \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " Mar 12 12:37:52.809640 master-0 kubenswrapper[13984]: I0312 12:37:52.809367 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjf7c\" (UniqueName: 
\"kubernetes.io/projected/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-kube-api-access-vjf7c\") pod \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " Mar 12 12:37:52.809640 master-0 kubenswrapper[13984]: I0312 12:37:52.809398 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-service-ca\") pod \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " Mar 12 12:37:52.809640 master-0 kubenswrapper[13984]: I0312 12:37:52.809428 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-oauth-config\") pod \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " Mar 12 12:37:52.810342 master-0 kubenswrapper[13984]: I0312 12:37:52.809910 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-config\") pod \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " Mar 12 12:37:52.810342 master-0 kubenswrapper[13984]: I0312 12:37:52.810007 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0485a15a-a4a4-4df7-a74c-d7379f4d01cb" (UID: "0485a15a-a4a4-4df7-a74c-d7379f4d01cb"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:37:52.810342 master-0 kubenswrapper[13984]: I0312 12:37:52.810045 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-trusted-ca-bundle\") pod \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\" (UID: \"0485a15a-a4a4-4df7-a74c-d7379f4d01cb\") " Mar 12 12:37:52.810640 master-0 kubenswrapper[13984]: I0312 12:37:52.810553 13984 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 12 12:37:52.810640 master-0 kubenswrapper[13984]: I0312 12:37:52.810608 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-config" (OuterVolumeSpecName: "console-config") pod "0485a15a-a4a4-4df7-a74c-d7379f4d01cb" (UID: "0485a15a-a4a4-4df7-a74c-d7379f4d01cb"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:37:52.810805 master-0 kubenswrapper[13984]: I0312 12:37:52.810729 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-service-ca" (OuterVolumeSpecName: "service-ca") pod "0485a15a-a4a4-4df7-a74c-d7379f4d01cb" (UID: "0485a15a-a4a4-4df7-a74c-d7379f4d01cb"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:37:52.812191 master-0 kubenswrapper[13984]: I0312 12:37:52.811020 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0485a15a-a4a4-4df7-a74c-d7379f4d01cb" (UID: "0485a15a-a4a4-4df7-a74c-d7379f4d01cb"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:37:52.817302 master-0 kubenswrapper[13984]: I0312 12:37:52.817267 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0485a15a-a4a4-4df7-a74c-d7379f4d01cb" (UID: "0485a15a-a4a4-4df7-a74c-d7379f4d01cb"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:37:52.817686 master-0 kubenswrapper[13984]: I0312 12:37:52.817628 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-kube-api-access-vjf7c" (OuterVolumeSpecName: "kube-api-access-vjf7c") pod "0485a15a-a4a4-4df7-a74c-d7379f4d01cb" (UID: "0485a15a-a4a4-4df7-a74c-d7379f4d01cb"). InnerVolumeSpecName "kube-api-access-vjf7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:37:52.820192 master-0 kubenswrapper[13984]: I0312 12:37:52.820151 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0485a15a-a4a4-4df7-a74c-d7379f4d01cb" (UID: "0485a15a-a4a4-4df7-a74c-d7379f4d01cb"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:37:52.912700 master-0 kubenswrapper[13984]: I0312 12:37:52.912457 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vjf7c\" (UniqueName: \"kubernetes.io/projected/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-kube-api-access-vjf7c\") on node \"master-0\" DevicePath \"\"" Mar 12 12:37:52.912700 master-0 kubenswrapper[13984]: I0312 12:37:52.912533 13984 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 12 12:37:52.912700 master-0 kubenswrapper[13984]: I0312 12:37:52.912553 13984 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:37:52.912700 master-0 kubenswrapper[13984]: I0312 12:37:52.912571 13984 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:37:52.912700 master-0 kubenswrapper[13984]: I0312 12:37:52.912587 13984 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:37:52.912700 master-0 kubenswrapper[13984]: I0312 12:37:52.912605 13984 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0485a15a-a4a4-4df7-a74c-d7379f4d01cb-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 12 12:37:53.636047 master-0 kubenswrapper[13984]: I0312 12:37:53.635982 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-79db4dd985-zqfm7_0485a15a-a4a4-4df7-a74c-d7379f4d01cb/console/0.log" Mar 12 12:37:53.636047 master-0 kubenswrapper[13984]: I0312 12:37:53.636047 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79db4dd985-zqfm7" event={"ID":"0485a15a-a4a4-4df7-a74c-d7379f4d01cb","Type":"ContainerDied","Data":"25efc1af59a3bbaf7ecf95082004d95cdc92d21a5c4ab5b3028b6330b943e192"} Mar 12 12:37:53.637002 master-0 kubenswrapper[13984]: I0312 12:37:53.636081 13984 scope.go:117] "RemoveContainer" containerID="4cad2d72886f2435dbe577ee337f88f9af220f6e90117513eb2892126e93c01f" Mar 12 12:37:53.637002 master-0 kubenswrapper[13984]: I0312 12:37:53.636221 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-79db4dd985-zqfm7" Mar 12 12:37:53.682782 master-0 kubenswrapper[13984]: I0312 12:37:53.682720 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-79db4dd985-zqfm7"] Mar 12 12:37:53.687619 master-0 kubenswrapper[13984]: I0312 12:37:53.687586 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-79db4dd985-zqfm7"] Mar 12 12:37:53.991955 master-0 kubenswrapper[13984]: I0312 12:37:53.991859 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" path="/var/lib/kubelet/pods/0485a15a-a4a4-4df7-a74c-d7379f4d01cb/volumes" Mar 12 12:37:55.696441 master-0 kubenswrapper[13984]: I0312 12:37:55.696371 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 12 12:37:55.697116 master-0 kubenswrapper[13984]: E0312 12:37:55.696691 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" Mar 12 12:37:55.697116 master-0 kubenswrapper[13984]: I0312 12:37:55.696705 13984 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" Mar 12 12:37:55.697116 master-0 kubenswrapper[13984]: I0312 12:37:55.696852 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="0485a15a-a4a4-4df7-a74c-d7379f4d01cb" containerName="console" Mar 12 12:37:55.697448 master-0 kubenswrapper[13984]: I0312 12:37:55.697417 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.700419 master-0 kubenswrapper[13984]: I0312 12:37:55.700365 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-rwh96" Mar 12 12:37:55.700683 master-0 kubenswrapper[13984]: I0312 12:37:55.700624 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 12 12:37:55.718616 master-0 kubenswrapper[13984]: I0312 12:37:55.718535 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 12 12:37:55.855638 master-0 kubenswrapper[13984]: I0312 12:37:55.855355 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36c3b10a-4214-4e96-aea7-b424049bdda5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.855638 master-0 kubenswrapper[13984]: I0312 12:37:55.855534 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.855638 master-0 
kubenswrapper[13984]: I0312 12:37:55.855566 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-var-lock\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.957676 master-0 kubenswrapper[13984]: I0312 12:37:55.957516 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.957676 master-0 kubenswrapper[13984]: I0312 12:37:55.957578 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-var-lock\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.957676 master-0 kubenswrapper[13984]: I0312 12:37:55.957646 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36c3b10a-4214-4e96-aea7-b424049bdda5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.958217 master-0 kubenswrapper[13984]: I0312 12:37:55.957654 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" 
Mar 12 12:37:55.958217 master-0 kubenswrapper[13984]: I0312 12:37:55.957752 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-var-lock\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:55.973314 master-0 kubenswrapper[13984]: I0312 12:37:55.973254 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36c3b10a-4214-4e96-aea7-b424049bdda5-kube-api-access\") pod \"installer-4-master-0\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:56.018630 master-0 kubenswrapper[13984]: I0312 12:37:56.018522 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:37:56.424635 master-0 kubenswrapper[13984]: I0312 12:37:56.424591 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 12 12:37:56.429757 master-0 kubenswrapper[13984]: W0312 12:37:56.429706 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod36c3b10a_4214_4e96_aea7_b424049bdda5.slice/crio-4b871d2e92b012c12465516a512c615631a66a8c377af58558cfd25e92a1e1ff WatchSource:0}: Error finding container 4b871d2e92b012c12465516a512c615631a66a8c377af58558cfd25e92a1e1ff: Status 404 returned error can't find the container with id 4b871d2e92b012c12465516a512c615631a66a8c377af58558cfd25e92a1e1ff Mar 12 12:37:56.664802 master-0 kubenswrapper[13984]: I0312 12:37:56.664730 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" 
event={"ID":"36c3b10a-4214-4e96-aea7-b424049bdda5","Type":"ContainerStarted","Data":"4b871d2e92b012c12465516a512c615631a66a8c377af58558cfd25e92a1e1ff"} Mar 12 12:37:57.679006 master-0 kubenswrapper[13984]: I0312 12:37:57.678913 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"36c3b10a-4214-4e96-aea7-b424049bdda5","Type":"ContainerStarted","Data":"fd15b5c1ee5ef8b0a96b118f360d446c896b1f616aa0722682a35f941b565c57"} Mar 12 12:37:57.711428 master-0 kubenswrapper[13984]: I0312 12:37:57.711314 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.711289851 podStartE2EDuration="2.711289851s" podCreationTimestamp="2026-03-12 12:37:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:37:57.702658907 +0000 UTC m=+809.900674419" watchObservedRunningTime="2026-03-12 12:37:57.711289851 +0000 UTC m=+809.909305363" Mar 12 12:38:05.263803 master-0 kubenswrapper[13984]: I0312 12:38:05.263717 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-6787b5d975-rzbv4"] Mar 12 12:38:05.266250 master-0 kubenswrapper[13984]: I0312 12:38:05.266202 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:05.286297 master-0 kubenswrapper[13984]: I0312 12:38:05.286209 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-6787b5d975-rzbv4"] Mar 12 12:38:05.304499 master-0 kubenswrapper[13984]: I0312 12:38:05.304197 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f4eeac9d-69cc-492c-a662-7b6d6289cb86-os-client-config\") pod \"nova-console-poller-6787b5d975-rzbv4\" (UID: \"f4eeac9d-69cc-492c-a662-7b6d6289cb86\") " pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:05.304499 master-0 kubenswrapper[13984]: I0312 12:38:05.304385 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hrg9\" (UniqueName: \"kubernetes.io/projected/f4eeac9d-69cc-492c-a662-7b6d6289cb86-kube-api-access-6hrg9\") pod \"nova-console-poller-6787b5d975-rzbv4\" (UID: \"f4eeac9d-69cc-492c-a662-7b6d6289cb86\") " pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:05.406051 master-0 kubenswrapper[13984]: I0312 12:38:05.405984 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f4eeac9d-69cc-492c-a662-7b6d6289cb86-os-client-config\") pod \"nova-console-poller-6787b5d975-rzbv4\" (UID: \"f4eeac9d-69cc-492c-a662-7b6d6289cb86\") " pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:05.406288 master-0 kubenswrapper[13984]: I0312 12:38:05.406063 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hrg9\" (UniqueName: \"kubernetes.io/projected/f4eeac9d-69cc-492c-a662-7b6d6289cb86-kube-api-access-6hrg9\") pod \"nova-console-poller-6787b5d975-rzbv4\" (UID: \"f4eeac9d-69cc-492c-a662-7b6d6289cb86\") " 
pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:05.409027 master-0 kubenswrapper[13984]: I0312 12:38:05.408988 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/f4eeac9d-69cc-492c-a662-7b6d6289cb86-os-client-config\") pod \"nova-console-poller-6787b5d975-rzbv4\" (UID: \"f4eeac9d-69cc-492c-a662-7b6d6289cb86\") " pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:05.423695 master-0 kubenswrapper[13984]: I0312 12:38:05.422827 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hrg9\" (UniqueName: \"kubernetes.io/projected/f4eeac9d-69cc-492c-a662-7b6d6289cb86-kube-api-access-6hrg9\") pod \"nova-console-poller-6787b5d975-rzbv4\" (UID: \"f4eeac9d-69cc-492c-a662-7b6d6289cb86\") " pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:05.584298 master-0 kubenswrapper[13984]: I0312 12:38:05.584174 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" Mar 12 12:38:06.063642 master-0 kubenswrapper[13984]: I0312 12:38:06.063573 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-6787b5d975-rzbv4"] Mar 12 12:38:06.072555 master-0 kubenswrapper[13984]: I0312 12:38:06.072438 13984 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 12:38:06.778927 master-0 kubenswrapper[13984]: I0312 12:38:06.778832 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" event={"ID":"f4eeac9d-69cc-492c-a662-7b6d6289cb86","Type":"ContainerStarted","Data":"b0bb3e5d26f0b72f27c31a320403f3fd510ef03370ae051ea044666e89e9a0be"} Mar 12 12:38:12.825829 master-0 kubenswrapper[13984]: I0312 12:38:12.825699 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" event={"ID":"f4eeac9d-69cc-492c-a662-7b6d6289cb86","Type":"ContainerStarted","Data":"31945cdc3898da0345825adec0d003b1ed6f3ca9cb00cbe99fc8044b1116f8d6"} Mar 12 12:38:13.838751 master-0 kubenswrapper[13984]: I0312 12:38:13.838649 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" event={"ID":"f4eeac9d-69cc-492c-a662-7b6d6289cb86","Type":"ContainerStarted","Data":"b0041b6902c1903fc0eca5856dda3c365fabf80c54b8a0b00f57a0f0352c4524"} Mar 12 12:38:13.863555 master-0 kubenswrapper[13984]: I0312 12:38:13.863426 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-poller-6787b5d975-rzbv4" podStartSLOduration=1.569085976 podStartE2EDuration="8.863400154s" podCreationTimestamp="2026-03-12 12:38:05 +0000 UTC" firstStartedPulling="2026-03-12 12:38:06.072341866 +0000 UTC m=+818.270357378" lastFinishedPulling="2026-03-12 12:38:13.366656054 +0000 UTC m=+825.564671556" observedRunningTime="2026-03-12 
12:38:13.85867289 +0000 UTC m=+826.056688412" watchObservedRunningTime="2026-03-12 12:38:13.863400154 +0000 UTC m=+826.061415646" Mar 12 12:38:28.649239 master-0 kubenswrapper[13984]: I0312 12:38:28.649147 13984 scope.go:117] "RemoveContainer" containerID="545ce6883090e4ecdcea7622975d3e1147cda32524b819153cd3b1df99e6ad16" Mar 12 12:38:28.672221 master-0 kubenswrapper[13984]: I0312 12:38:28.672147 13984 scope.go:117] "RemoveContainer" containerID="37ce22b830e08d23df66196c50da46b1d74b7f16a0a3129542989b5d4bdc3dd3" Mar 12 12:38:29.406090 master-0 kubenswrapper[13984]: I0312 12:38:29.405987 13984 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:38:29.406713 master-0 kubenswrapper[13984]: I0312 12:38:29.406594 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="cluster-policy-controller" containerID="cri-o://5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" gracePeriod=30 Mar 12 12:38:29.406809 master-0 kubenswrapper[13984]: I0312 12:38:29.406686 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" gracePeriod=30 Mar 12 12:38:29.406809 master-0 kubenswrapper[13984]: I0312 12:38:29.406739 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" containerID="cri-o://e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" gracePeriod=30 Mar 12 12:38:29.406809 
master-0 kubenswrapper[13984]: I0312 12:38:29.406687 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" gracePeriod=30 Mar 12 12:38:29.407288 master-0 kubenswrapper[13984]: I0312 12:38:29.407254 13984 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:38:29.407625 master-0 kubenswrapper[13984]: E0312 12:38:29.407587 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" Mar 12 12:38:29.407625 master-0 kubenswrapper[13984]: I0312 12:38:29.407609 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" Mar 12 12:38:29.407625 master-0 kubenswrapper[13984]: E0312 12:38:29.407625 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-recovery-controller" Mar 12 12:38:29.407844 master-0 kubenswrapper[13984]: I0312 12:38:29.407635 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-recovery-controller" Mar 12 12:38:29.407844 master-0 kubenswrapper[13984]: E0312 12:38:29.407658 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-cert-syncer" Mar 12 12:38:29.407844 master-0 kubenswrapper[13984]: I0312 12:38:29.407667 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-cert-syncer" Mar 12 12:38:29.407844 master-0 
kubenswrapper[13984]: E0312 12:38:29.407689 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="cluster-policy-controller" Mar 12 12:38:29.407844 master-0 kubenswrapper[13984]: I0312 12:38:29.407697 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="cluster-policy-controller" Mar 12 12:38:29.408134 master-0 kubenswrapper[13984]: I0312 12:38:29.407851 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-recovery-controller" Mar 12 12:38:29.408134 master-0 kubenswrapper[13984]: I0312 12:38:29.407878 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" Mar 12 12:38:29.408134 master-0 kubenswrapper[13984]: I0312 12:38:29.407894 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" Mar 12 12:38:29.408134 master-0 kubenswrapper[13984]: I0312 12:38:29.407903 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager-cert-syncer" Mar 12 12:38:29.408134 master-0 kubenswrapper[13984]: I0312 12:38:29.407922 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="cluster-policy-controller" Mar 12 12:38:29.408134 master-0 kubenswrapper[13984]: E0312 12:38:29.408086 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" Mar 12 12:38:29.408134 master-0 kubenswrapper[13984]: I0312 12:38:29.408097 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a6a89fd8fe0c6f59a2124101057324" containerName="kube-controller-manager" Mar 12 12:38:29.442886 
master-0 kubenswrapper[13984]: I0312 12:38:29.442819 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5976cfb1534d8e5b07a26777aadb94a1-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5976cfb1534d8e5b07a26777aadb94a1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.443047 master-0 kubenswrapper[13984]: I0312 12:38:29.442935 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5976cfb1534d8e5b07a26777aadb94a1-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5976cfb1534d8e5b07a26777aadb94a1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.544279 master-0 kubenswrapper[13984]: I0312 12:38:29.544153 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5976cfb1534d8e5b07a26777aadb94a1-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5976cfb1534d8e5b07a26777aadb94a1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.544384 master-0 kubenswrapper[13984]: I0312 12:38:29.544314 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/5976cfb1534d8e5b07a26777aadb94a1-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5976cfb1534d8e5b07a26777aadb94a1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.544384 master-0 kubenswrapper[13984]: I0312 12:38:29.544371 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5976cfb1534d8e5b07a26777aadb94a1-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: 
\"5976cfb1534d8e5b07a26777aadb94a1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.544505 master-0 kubenswrapper[13984]: I0312 12:38:29.544462 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/5976cfb1534d8e5b07a26777aadb94a1-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"5976cfb1534d8e5b07a26777aadb94a1\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.586593 master-0 kubenswrapper[13984]: I0312 12:38:29.586473 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d2a6a89fd8fe0c6f59a2124101057324/kube-controller-manager-cert-syncer/0.log" Mar 12 12:38:29.587796 master-0 kubenswrapper[13984]: I0312 12:38:29.587761 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d2a6a89fd8fe0c6f59a2124101057324/kube-controller-manager/0.log" Mar 12 12:38:29.587903 master-0 kubenswrapper[13984]: I0312 12:38:29.587873 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.592959 master-0 kubenswrapper[13984]: I0312 12:38:29.592900 13984 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="d2a6a89fd8fe0c6f59a2124101057324" podUID="5976cfb1534d8e5b07a26777aadb94a1" Mar 12 12:38:29.645254 master-0 kubenswrapper[13984]: I0312 12:38:29.645191 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-cert-dir\") pod \"d2a6a89fd8fe0c6f59a2124101057324\" (UID: \"d2a6a89fd8fe0c6f59a2124101057324\") " Mar 12 12:38:29.645456 master-0 kubenswrapper[13984]: I0312 12:38:29.645319 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "d2a6a89fd8fe0c6f59a2124101057324" (UID: "d2a6a89fd8fe0c6f59a2124101057324"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:38:29.645456 master-0 kubenswrapper[13984]: I0312 12:38:29.645408 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-resource-dir\") pod \"d2a6a89fd8fe0c6f59a2124101057324\" (UID: \"d2a6a89fd8fe0c6f59a2124101057324\") " Mar 12 12:38:29.645553 master-0 kubenswrapper[13984]: I0312 12:38:29.645517 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "d2a6a89fd8fe0c6f59a2124101057324" (UID: "d2a6a89fd8fe0c6f59a2124101057324"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:38:29.645796 master-0 kubenswrapper[13984]: I0312 12:38:29.645767 13984 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:38:29.645796 master-0 kubenswrapper[13984]: I0312 12:38:29.645792 13984 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d2a6a89fd8fe0c6f59a2124101057324-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:38:29.977720 master-0 kubenswrapper[13984]: I0312 12:38:29.977665 13984 generic.go:334] "Generic (PLEG): container finished" podID="36c3b10a-4214-4e96-aea7-b424049bdda5" containerID="fd15b5c1ee5ef8b0a96b118f360d446c896b1f616aa0722682a35f941b565c57" exitCode=0 Mar 12 12:38:29.978188 master-0 kubenswrapper[13984]: I0312 12:38:29.977737 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"36c3b10a-4214-4e96-aea7-b424049bdda5","Type":"ContainerDied","Data":"fd15b5c1ee5ef8b0a96b118f360d446c896b1f616aa0722682a35f941b565c57"} Mar 12 12:38:29.980396 master-0 kubenswrapper[13984]: I0312 12:38:29.980352 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d2a6a89fd8fe0c6f59a2124101057324/kube-controller-manager-cert-syncer/0.log" Mar 12 12:38:29.981928 master-0 kubenswrapper[13984]: I0312 12:38:29.981905 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d2a6a89fd8fe0c6f59a2124101057324/kube-controller-manager/0.log" Mar 12 12:38:29.981985 master-0 kubenswrapper[13984]: I0312 12:38:29.981949 13984 generic.go:334] "Generic (PLEG): container finished" podID="d2a6a89fd8fe0c6f59a2124101057324" 
containerID="e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" exitCode=0 Mar 12 12:38:29.981985 master-0 kubenswrapper[13984]: I0312 12:38:29.981964 13984 generic.go:334] "Generic (PLEG): container finished" podID="d2a6a89fd8fe0c6f59a2124101057324" containerID="6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" exitCode=0 Mar 12 12:38:29.981985 master-0 kubenswrapper[13984]: I0312 12:38:29.981973 13984 generic.go:334] "Generic (PLEG): container finished" podID="d2a6a89fd8fe0c6f59a2124101057324" containerID="8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" exitCode=2 Mar 12 12:38:29.981985 master-0 kubenswrapper[13984]: I0312 12:38:29.981981 13984 generic.go:334] "Generic (PLEG): container finished" podID="d2a6a89fd8fe0c6f59a2124101057324" containerID="5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" exitCode=0 Mar 12 12:38:29.982097 master-0 kubenswrapper[13984]: I0312 12:38:29.982077 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:29.988336 master-0 kubenswrapper[13984]: I0312 12:38:29.988293 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2a6a89fd8fe0c6f59a2124101057324" path="/var/lib/kubelet/pods/d2a6a89fd8fe0c6f59a2124101057324/volumes" Mar 12 12:38:29.989408 master-0 kubenswrapper[13984]: I0312 12:38:29.989382 13984 scope.go:117] "RemoveContainer" containerID="e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" Mar 12 12:38:30.011802 master-0 kubenswrapper[13984]: I0312 12:38:30.010440 13984 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="d2a6a89fd8fe0c6f59a2124101057324" podUID="5976cfb1534d8e5b07a26777aadb94a1" Mar 12 12:38:30.018936 master-0 kubenswrapper[13984]: I0312 12:38:30.018890 13984 scope.go:117] "RemoveContainer" containerID="6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" Mar 12 12:38:30.035691 master-0 kubenswrapper[13984]: I0312 12:38:30.035661 13984 scope.go:117] "RemoveContainer" containerID="8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" Mar 12 12:38:30.063105 master-0 kubenswrapper[13984]: I0312 12:38:30.063039 13984 scope.go:117] "RemoveContainer" containerID="5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" Mar 12 12:38:30.083845 master-0 kubenswrapper[13984]: I0312 12:38:30.083784 13984 scope.go:117] "RemoveContainer" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" Mar 12 12:38:30.106931 master-0 kubenswrapper[13984]: I0312 12:38:30.106875 13984 scope.go:117] "RemoveContainer" containerID="e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" Mar 12 12:38:30.107835 master-0 kubenswrapper[13984]: E0312 12:38:30.107790 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": container with ID starting with e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f not found: ID does not exist" containerID="e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" Mar 12 12:38:30.107931 master-0 kubenswrapper[13984]: I0312 12:38:30.107835 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f"} err="failed to get container status \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": rpc error: code = NotFound desc = could not find container \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": container with ID starting with e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f not found: ID does not exist" Mar 12 12:38:30.107931 master-0 kubenswrapper[13984]: I0312 12:38:30.107859 13984 scope.go:117] "RemoveContainer" containerID="6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" Mar 12 12:38:30.108396 master-0 kubenswrapper[13984]: E0312 12:38:30.108353 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": container with ID starting with 6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c not found: ID does not exist" containerID="6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" Mar 12 12:38:30.108458 master-0 kubenswrapper[13984]: I0312 12:38:30.108394 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c"} err="failed to get container status \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": rpc error: code = NotFound desc = could not find 
container \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": container with ID starting with 6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c not found: ID does not exist" Mar 12 12:38:30.108458 master-0 kubenswrapper[13984]: I0312 12:38:30.108410 13984 scope.go:117] "RemoveContainer" containerID="8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" Mar 12 12:38:30.109438 master-0 kubenswrapper[13984]: E0312 12:38:30.109402 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": container with ID starting with 8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63 not found: ID does not exist" containerID="8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" Mar 12 12:38:30.109519 master-0 kubenswrapper[13984]: I0312 12:38:30.109435 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63"} err="failed to get container status \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": rpc error: code = NotFound desc = could not find container \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": container with ID starting with 8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63 not found: ID does not exist" Mar 12 12:38:30.109519 master-0 kubenswrapper[13984]: I0312 12:38:30.109452 13984 scope.go:117] "RemoveContainer" containerID="5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" Mar 12 12:38:30.109917 master-0 kubenswrapper[13984]: E0312 12:38:30.109890 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": container with ID starting with 
5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a not found: ID does not exist" containerID="5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" Mar 12 12:38:30.109985 master-0 kubenswrapper[13984]: I0312 12:38:30.109922 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a"} err="failed to get container status \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": rpc error: code = NotFound desc = could not find container \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": container with ID starting with 5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a not found: ID does not exist" Mar 12 12:38:30.109985 master-0 kubenswrapper[13984]: I0312 12:38:30.109937 13984 scope.go:117] "RemoveContainer" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" Mar 12 12:38:30.110412 master-0 kubenswrapper[13984]: E0312 12:38:30.110377 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": container with ID starting with f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482 not found: ID does not exist" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" Mar 12 12:38:30.110472 master-0 kubenswrapper[13984]: I0312 12:38:30.110411 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482"} err="failed to get container status \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": rpc error: code = NotFound desc = could not find container \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": container with ID starting with 
f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482 not found: ID does not exist" Mar 12 12:38:30.110472 master-0 kubenswrapper[13984]: I0312 12:38:30.110431 13984 scope.go:117] "RemoveContainer" containerID="e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" Mar 12 12:38:30.110891 master-0 kubenswrapper[13984]: I0312 12:38:30.110840 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f"} err="failed to get container status \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": rpc error: code = NotFound desc = could not find container \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": container with ID starting with e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f not found: ID does not exist" Mar 12 12:38:30.110891 master-0 kubenswrapper[13984]: I0312 12:38:30.110874 13984 scope.go:117] "RemoveContainer" containerID="6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" Mar 12 12:38:30.111171 master-0 kubenswrapper[13984]: I0312 12:38:30.111144 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c"} err="failed to get container status \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": rpc error: code = NotFound desc = could not find container \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": container with ID starting with 6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c not found: ID does not exist" Mar 12 12:38:30.111235 master-0 kubenswrapper[13984]: I0312 12:38:30.111172 13984 scope.go:117] "RemoveContainer" containerID="8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" Mar 12 12:38:30.111487 master-0 kubenswrapper[13984]: I0312 12:38:30.111426 13984 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63"} err="failed to get container status \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": rpc error: code = NotFound desc = could not find container \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": container with ID starting with 8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63 not found: ID does not exist" Mar 12 12:38:30.111549 master-0 kubenswrapper[13984]: I0312 12:38:30.111492 13984 scope.go:117] "RemoveContainer" containerID="5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" Mar 12 12:38:30.111887 master-0 kubenswrapper[13984]: I0312 12:38:30.111826 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a"} err="failed to get container status \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": rpc error: code = NotFound desc = could not find container \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": container with ID starting with 5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a not found: ID does not exist" Mar 12 12:38:30.111887 master-0 kubenswrapper[13984]: I0312 12:38:30.111846 13984 scope.go:117] "RemoveContainer" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" Mar 12 12:38:30.112173 master-0 kubenswrapper[13984]: I0312 12:38:30.112146 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482"} err="failed to get container status \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": rpc error: code = NotFound desc = could not find container \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": container with ID starting 
with f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482 not found: ID does not exist" Mar 12 12:38:30.112173 master-0 kubenswrapper[13984]: I0312 12:38:30.112167 13984 scope.go:117] "RemoveContainer" containerID="e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" Mar 12 12:38:30.112747 master-0 kubenswrapper[13984]: I0312 12:38:30.112724 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f"} err="failed to get container status \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": rpc error: code = NotFound desc = could not find container \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": container with ID starting with e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f not found: ID does not exist" Mar 12 12:38:30.112747 master-0 kubenswrapper[13984]: I0312 12:38:30.112745 13984 scope.go:117] "RemoveContainer" containerID="6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" Mar 12 12:38:30.113041 master-0 kubenswrapper[13984]: I0312 12:38:30.113021 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c"} err="failed to get container status \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": rpc error: code = NotFound desc = could not find container \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": container with ID starting with 6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c not found: ID does not exist" Mar 12 12:38:30.113086 master-0 kubenswrapper[13984]: I0312 12:38:30.113039 13984 scope.go:117] "RemoveContainer" containerID="8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" Mar 12 12:38:30.113589 master-0 kubenswrapper[13984]: I0312 12:38:30.113569 13984 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63"} err="failed to get container status \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": rpc error: code = NotFound desc = could not find container \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": container with ID starting with 8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63 not found: ID does not exist" Mar 12 12:38:30.113650 master-0 kubenswrapper[13984]: I0312 12:38:30.113589 13984 scope.go:117] "RemoveContainer" containerID="5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" Mar 12 12:38:30.113947 master-0 kubenswrapper[13984]: I0312 12:38:30.113894 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a"} err="failed to get container status \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": rpc error: code = NotFound desc = could not find container \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": container with ID starting with 5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a not found: ID does not exist" Mar 12 12:38:30.113995 master-0 kubenswrapper[13984]: I0312 12:38:30.113944 13984 scope.go:117] "RemoveContainer" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" Mar 12 12:38:30.114745 master-0 kubenswrapper[13984]: I0312 12:38:30.114724 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482"} err="failed to get container status \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": rpc error: code = NotFound desc = could not find container \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": 
container with ID starting with f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482 not found: ID does not exist" Mar 12 12:38:30.114745 master-0 kubenswrapper[13984]: I0312 12:38:30.114742 13984 scope.go:117] "RemoveContainer" containerID="e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f" Mar 12 12:38:30.115000 master-0 kubenswrapper[13984]: I0312 12:38:30.114975 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f"} err="failed to get container status \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": rpc error: code = NotFound desc = could not find container \"e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f\": container with ID starting with e998882a66889cf7ee5cbafcfff0431c3f47c02bc8c4f20b7220a8077159081f not found: ID does not exist" Mar 12 12:38:30.115000 master-0 kubenswrapper[13984]: I0312 12:38:30.114993 13984 scope.go:117] "RemoveContainer" containerID="6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c" Mar 12 12:38:30.115446 master-0 kubenswrapper[13984]: I0312 12:38:30.115379 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c"} err="failed to get container status \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": rpc error: code = NotFound desc = could not find container \"6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c\": container with ID starting with 6d1bb55a89c22b24211039ca4accc79a72982444eb11e98b52742f4cb08a068c not found: ID does not exist" Mar 12 12:38:30.115529 master-0 kubenswrapper[13984]: I0312 12:38:30.115446 13984 scope.go:117] "RemoveContainer" containerID="8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63" Mar 12 12:38:30.115824 master-0 kubenswrapper[13984]: I0312 12:38:30.115794 
13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63"} err="failed to get container status \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": rpc error: code = NotFound desc = could not find container \"8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63\": container with ID starting with 8594032b895eec2495e854d9680eecfb61280ee5675e202762a9e4f3ffc24a63 not found: ID does not exist" Mar 12 12:38:30.115824 master-0 kubenswrapper[13984]: I0312 12:38:30.115817 13984 scope.go:117] "RemoveContainer" containerID="5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a" Mar 12 12:38:30.116098 master-0 kubenswrapper[13984]: I0312 12:38:30.116056 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a"} err="failed to get container status \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": rpc error: code = NotFound desc = could not find container \"5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a\": container with ID starting with 5418a0e1148515b4821119bc3ac5a9f33573dc8ca96f3160420712180663b58a not found: ID does not exist" Mar 12 12:38:30.116098 master-0 kubenswrapper[13984]: I0312 12:38:30.116085 13984 scope.go:117] "RemoveContainer" containerID="f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482" Mar 12 12:38:30.116372 master-0 kubenswrapper[13984]: I0312 12:38:30.116338 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482"} err="failed to get container status \"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": rpc error: code = NotFound desc = could not find container 
\"f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482\": container with ID starting with f5276dacf8a3ef5efd9a4d0c8b61d4e119f83a1c305b6251f6938eec9a468482 not found: ID does not exist" Mar 12 12:38:31.266919 master-0 kubenswrapper[13984]: I0312 12:38:31.266796 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:38:31.370190 master-0 kubenswrapper[13984]: I0312 12:38:31.370122 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-kubelet-dir\") pod \"36c3b10a-4214-4e96-aea7-b424049bdda5\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " Mar 12 12:38:31.370418 master-0 kubenswrapper[13984]: I0312 12:38:31.370210 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36c3b10a-4214-4e96-aea7-b424049bdda5-kube-api-access\") pod \"36c3b10a-4214-4e96-aea7-b424049bdda5\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " Mar 12 12:38:31.370418 master-0 kubenswrapper[13984]: I0312 12:38:31.370252 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-var-lock\") pod \"36c3b10a-4214-4e96-aea7-b424049bdda5\" (UID: \"36c3b10a-4214-4e96-aea7-b424049bdda5\") " Mar 12 12:38:31.370418 master-0 kubenswrapper[13984]: I0312 12:38:31.370311 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "36c3b10a-4214-4e96-aea7-b424049bdda5" (UID: "36c3b10a-4214-4e96-aea7-b424049bdda5"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:38:31.370418 master-0 kubenswrapper[13984]: I0312 12:38:31.370397 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-var-lock" (OuterVolumeSpecName: "var-lock") pod "36c3b10a-4214-4e96-aea7-b424049bdda5" (UID: "36c3b10a-4214-4e96-aea7-b424049bdda5"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:38:31.374892 master-0 kubenswrapper[13984]: I0312 12:38:31.374813 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36c3b10a-4214-4e96-aea7-b424049bdda5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "36c3b10a-4214-4e96-aea7-b424049bdda5" (UID: "36c3b10a-4214-4e96-aea7-b424049bdda5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:38:31.472251 master-0 kubenswrapper[13984]: I0312 12:38:31.472164 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36c3b10a-4214-4e96-aea7-b424049bdda5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 12 12:38:31.472251 master-0 kubenswrapper[13984]: I0312 12:38:31.472235 13984 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 12 12:38:31.472251 master-0 kubenswrapper[13984]: I0312 12:38:31.472248 13984 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/36c3b10a-4214-4e96-aea7-b424049bdda5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:38:31.999338 master-0 kubenswrapper[13984]: I0312 12:38:31.999278 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 12 12:38:31.999645 master-0 kubenswrapper[13984]: I0312 12:38:31.999258 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"36c3b10a-4214-4e96-aea7-b424049bdda5","Type":"ContainerDied","Data":"4b871d2e92b012c12465516a512c615631a66a8c377af58558cfd25e92a1e1ff"} Mar 12 12:38:31.999645 master-0 kubenswrapper[13984]: I0312 12:38:31.999524 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b871d2e92b012c12465516a512c615631a66a8c377af58558cfd25e92a1e1ff" Mar 12 12:38:41.979131 master-0 kubenswrapper[13984]: I0312 12:38:41.979064 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:42.003681 master-0 kubenswrapper[13984]: I0312 12:38:42.003627 13984 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dff159df-5101-4844-b403-34acf3a139b9" Mar 12 12:38:42.003827 master-0 kubenswrapper[13984]: I0312 12:38:42.003690 13984 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="dff159df-5101-4844-b403-34acf3a139b9" Mar 12 12:38:42.111750 master-0 kubenswrapper[13984]: I0312 12:38:42.111661 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:38:42.115768 master-0 kubenswrapper[13984]: I0312 12:38:42.115686 13984 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:42.128821 master-0 kubenswrapper[13984]: I0312 12:38:42.128764 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:42.129338 master-0 kubenswrapper[13984]: I0312 12:38:42.129303 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:38:42.159894 master-0 kubenswrapper[13984]: I0312 12:38:42.159841 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 12 12:38:43.107923 master-0 kubenswrapper[13984]: I0312 12:38:43.107830 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5976cfb1534d8e5b07a26777aadb94a1","Type":"ContainerStarted","Data":"d096a03b7027f30b2b31175a0840c5b903a7e11514579977e0df4a20f9acd56d"} Mar 12 12:38:43.107923 master-0 kubenswrapper[13984]: I0312 12:38:43.107877 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5976cfb1534d8e5b07a26777aadb94a1","Type":"ContainerStarted","Data":"eed8a4778b8f7e06e93886a7b0cca6970bcfaf3609b675032425dd7fc1e30155"} Mar 12 12:38:43.107923 master-0 kubenswrapper[13984]: I0312 12:38:43.107888 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5976cfb1534d8e5b07a26777aadb94a1","Type":"ContainerStarted","Data":"5b58fcfffe3e8fe8d8d30a88f088de14c18f69ed65ce65029ab40091c8967d9c"} Mar 12 12:38:43.107923 master-0 kubenswrapper[13984]: I0312 12:38:43.107896 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5976cfb1534d8e5b07a26777aadb94a1","Type":"ContainerStarted","Data":"2f4f0cfa300669f053920800cfeab4c889b5429c4415092ae831959200398977"} Mar 12 12:38:44.122188 master-0 kubenswrapper[13984]: I0312 12:38:44.122090 
13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"5976cfb1534d8e5b07a26777aadb94a1","Type":"ContainerStarted","Data":"daead6d6701611807f54a7edbabf1586098713ae5ea6af4001e4f6c9a48e0261"} Mar 12 12:38:44.148895 master-0 kubenswrapper[13984]: I0312 12:38:44.148759 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.148727597 podStartE2EDuration="2.148727597s" podCreationTimestamp="2026-03-12 12:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:38:44.144387184 +0000 UTC m=+856.342402766" watchObservedRunningTime="2026-03-12 12:38:44.148727597 +0000 UTC m=+856.346743099" Mar 12 12:38:52.130072 master-0 kubenswrapper[13984]: I0312 12:38:52.129955 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:52.131254 master-0 kubenswrapper[13984]: I0312 12:38:52.130102 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:52.131254 master-0 kubenswrapper[13984]: I0312 12:38:52.130123 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:52.131254 master-0 kubenswrapper[13984]: I0312 12:38:52.130139 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:52.136469 master-0 kubenswrapper[13984]: I0312 12:38:52.136408 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:52.136740 master-0 kubenswrapper[13984]: I0312 12:38:52.136663 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:52.205610 master-0 kubenswrapper[13984]: I0312 12:38:52.205468 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:38:52.205889 master-0 kubenswrapper[13984]: I0312 12:38:52.205840 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 12 12:39:04.093187 master-0 kubenswrapper[13984]: I0312 12:39:04.093123 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-86d746d899-wg2rc"] Mar 12 12:39:04.093887 master-0 kubenswrapper[13984]: E0312 12:39:04.093422 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36c3b10a-4214-4e96-aea7-b424049bdda5" containerName="installer" Mar 12 12:39:04.093887 master-0 kubenswrapper[13984]: I0312 12:39:04.093434 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="36c3b10a-4214-4e96-aea7-b424049bdda5" containerName="installer" Mar 12 12:39:04.093887 master-0 kubenswrapper[13984]: I0312 12:39:04.093588 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="36c3b10a-4214-4e96-aea7-b424049bdda5" containerName="installer" Mar 12 12:39:04.094422 master-0 kubenswrapper[13984]: I0312 12:39:04.094371 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" Mar 12 12:39:04.127724 master-0 kubenswrapper[13984]: I0312 12:39:04.127654 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-86d746d899-wg2rc"] Mar 12 12:39:04.200866 master-0 kubenswrapper[13984]: I0312 12:39:04.200806 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pvck\" (UniqueName: \"kubernetes.io/projected/4b807f7c-db79-41c1-b956-9eba8a501385-kube-api-access-2pvck\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" Mar 12 12:39:04.201067 master-0 kubenswrapper[13984]: I0312 12:39:04.200885 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4b807f7c-db79-41c1-b956-9eba8a501385-os-client-config\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" Mar 12 12:39:04.201067 master-0 kubenswrapper[13984]: I0312 12:39:04.200907 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/4b807f7c-db79-41c1-b956-9eba8a501385-nova-console-recordings-pv\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" Mar 12 12:39:04.302085 master-0 kubenswrapper[13984]: I0312 12:39:04.302026 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pvck\" (UniqueName: \"kubernetes.io/projected/4b807f7c-db79-41c1-b956-9eba8a501385-kube-api-access-2pvck\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: 
\"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc"
Mar 12 12:39:04.302339 master-0 kubenswrapper[13984]: I0312 12:39:04.302282 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4b807f7c-db79-41c1-b956-9eba8a501385-os-client-config\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc"
Mar 12 12:39:04.302405 master-0 kubenswrapper[13984]: I0312 12:39:04.302364 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/4b807f7c-db79-41c1-b956-9eba8a501385-nova-console-recordings-pv\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc"
Mar 12 12:39:04.305734 master-0 kubenswrapper[13984]: I0312 12:39:04.305708 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/4b807f7c-db79-41c1-b956-9eba8a501385-os-client-config\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc"
Mar 12 12:39:04.318888 master-0 kubenswrapper[13984]: I0312 12:39:04.318851 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pvck\" (UniqueName: \"kubernetes.io/projected/4b807f7c-db79-41c1-b956-9eba8a501385-kube-api-access-2pvck\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc"
Mar 12 12:39:04.925816 master-0 kubenswrapper[13984]: I0312 12:39:04.925740 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/4b807f7c-db79-41c1-b956-9eba8a501385-nova-console-recordings-pv\") pod \"nova-console-recorder-86d746d899-wg2rc\" (UID: \"4b807f7c-db79-41c1-b956-9eba8a501385\") " pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc"
Mar 12 12:39:05.010501 master-0 kubenswrapper[13984]: I0312 12:39:05.010403 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc"
Mar 12 12:39:05.433710 master-0 kubenswrapper[13984]: I0312 12:39:05.433657 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-86d746d899-wg2rc"]
Mar 12 12:39:05.500623 master-0 kubenswrapper[13984]: I0312 12:39:05.500540 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" event={"ID":"4b807f7c-db79-41c1-b956-9eba8a501385","Type":"ContainerStarted","Data":"7f10f2fda076e2aad3d694d86d912dae15eeea25ceb40827b86b27e6936b80c5"}
Mar 12 12:39:15.584442 master-0 kubenswrapper[13984]: I0312 12:39:15.584372 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" event={"ID":"4b807f7c-db79-41c1-b956-9eba8a501385","Type":"ContainerStarted","Data":"23b81a9f613259210e782a0c929d03fe1fbc5f5020da4594119815e326ba9ba3"}
Mar 12 12:39:15.584442 master-0 kubenswrapper[13984]: I0312 12:39:15.584437 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" event={"ID":"4b807f7c-db79-41c1-b956-9eba8a501385","Type":"ContainerStarted","Data":"ccb64af3333994da0c197ea89d2473fca81aac034799dcb8163133ee178245f9"}
Mar 12 12:39:15.616531 master-0 kubenswrapper[13984]: I0312 12:39:15.616413 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/nova-console-recorder-86d746d899-wg2rc" podStartSLOduration=2.007444823 podStartE2EDuration="11.616389313s" podCreationTimestamp="2026-03-12 12:39:04 +0000 UTC" firstStartedPulling="2026-03-12 12:39:05.439589999 +0000 UTC m=+877.637605491" lastFinishedPulling="2026-03-12 12:39:15.048534479 +0000 UTC m=+887.246549981" observedRunningTime="2026-03-12 12:39:15.610181734 +0000 UTC m=+887.808197246" watchObservedRunningTime="2026-03-12 12:39:15.616389313 +0000 UTC m=+887.814404825"
Mar 12 12:39:43.345194 master-0 kubenswrapper[13984]: I0312 12:39:43.345094 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"]
Mar 12 12:39:43.351393 master-0 kubenswrapper[13984]: I0312 12:39:43.351342 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.364226 master-0 kubenswrapper[13984]: I0312 12:39:43.364160 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"]
Mar 12 12:39:43.405634 master-0 kubenswrapper[13984]: I0312 12:39:43.405552 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.405634 master-0 kubenswrapper[13984]: I0312 12:39:43.405616 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.405927 master-0 kubenswrapper[13984]: I0312 12:39:43.405730 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfr2r\" (UniqueName: \"kubernetes.io/projected/a3d56617-f941-499c-b302-a93826be292a-kube-api-access-cfr2r\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.507551 master-0 kubenswrapper[13984]: I0312 12:39:43.507454 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.507793 master-0 kubenswrapper[13984]: I0312 12:39:43.507745 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfr2r\" (UniqueName: \"kubernetes.io/projected/a3d56617-f941-499c-b302-a93826be292a-kube-api-access-cfr2r\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.507920 master-0 kubenswrapper[13984]: I0312 12:39:43.507860 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.508447 master-0 kubenswrapper[13984]: I0312 12:39:43.508419 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.508795 master-0 kubenswrapper[13984]: I0312 12:39:43.508752 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.527183 master-0 kubenswrapper[13984]: I0312 12:39:43.527103 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfr2r\" (UniqueName: \"kubernetes.io/projected/a3d56617-f941-499c-b302-a93826be292a-kube-api-access-cfr2r\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:43.671605 master-0 kubenswrapper[13984]: I0312 12:39:43.671394 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:44.130248 master-0 kubenswrapper[13984]: I0312 12:39:44.130170 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"]
Mar 12 12:39:44.849942 master-0 kubenswrapper[13984]: I0312 12:39:44.849797 13984 generic.go:334] "Generic (PLEG): container finished" podID="a3d56617-f941-499c-b302-a93826be292a" containerID="daca18b9865bbe0679055db031ccd299c42aeb4983073bd243cd6d5b29a797d3" exitCode=0
Mar 12 12:39:44.849942 master-0 kubenswrapper[13984]: I0312 12:39:44.849890 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28" event={"ID":"a3d56617-f941-499c-b302-a93826be292a","Type":"ContainerDied","Data":"daca18b9865bbe0679055db031ccd299c42aeb4983073bd243cd6d5b29a797d3"}
Mar 12 12:39:44.850687 master-0 kubenswrapper[13984]: I0312 12:39:44.849952 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28" event={"ID":"a3d56617-f941-499c-b302-a93826be292a","Type":"ContainerStarted","Data":"9ea6f55463099927aece89384818f4b1f3d8232e5005c6ea7afddf26037bd867"}
Mar 12 12:39:46.866614 master-0 kubenswrapper[13984]: I0312 12:39:46.866558 13984 generic.go:334] "Generic (PLEG): container finished" podID="a3d56617-f941-499c-b302-a93826be292a" containerID="ed0484426255174a01bb1642d09ab64329b859400f757be6967938c277060f34" exitCode=0
Mar 12 12:39:46.866614 master-0 kubenswrapper[13984]: I0312 12:39:46.866603 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28" event={"ID":"a3d56617-f941-499c-b302-a93826be292a","Type":"ContainerDied","Data":"ed0484426255174a01bb1642d09ab64329b859400f757be6967938c277060f34"}
Mar 12 12:39:47.877891 master-0 kubenswrapper[13984]: I0312 12:39:47.877816 13984 generic.go:334] "Generic (PLEG): container finished" podID="a3d56617-f941-499c-b302-a93826be292a" containerID="54ba42edaa2067cc9fe5b1571b92b4d3822b81110b0fb91f5401ba67b09f8929" exitCode=0
Mar 12 12:39:47.877891 master-0 kubenswrapper[13984]: I0312 12:39:47.877869 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28" event={"ID":"a3d56617-f941-499c-b302-a93826be292a","Type":"ContainerDied","Data":"54ba42edaa2067cc9fe5b1571b92b4d3822b81110b0fb91f5401ba67b09f8929"}
Mar 12 12:39:49.154344 master-0 kubenswrapper[13984]: I0312 12:39:49.154235 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:49.298071 master-0 kubenswrapper[13984]: I0312 12:39:49.297953 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-util\") pod \"a3d56617-f941-499c-b302-a93826be292a\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") "
Mar 12 12:39:49.298408 master-0 kubenswrapper[13984]: I0312 12:39:49.298188 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-bundle\") pod \"a3d56617-f941-499c-b302-a93826be292a\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") "
Mar 12 12:39:49.298408 master-0 kubenswrapper[13984]: I0312 12:39:49.298297 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfr2r\" (UniqueName: \"kubernetes.io/projected/a3d56617-f941-499c-b302-a93826be292a-kube-api-access-cfr2r\") pod \"a3d56617-f941-499c-b302-a93826be292a\" (UID: \"a3d56617-f941-499c-b302-a93826be292a\") "
Mar 12 12:39:49.299737 master-0 kubenswrapper[13984]: I0312 12:39:49.299662 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-bundle" (OuterVolumeSpecName: "bundle") pod "a3d56617-f941-499c-b302-a93826be292a" (UID: "a3d56617-f941-499c-b302-a93826be292a"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:39:49.302366 master-0 kubenswrapper[13984]: I0312 12:39:49.302152 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3d56617-f941-499c-b302-a93826be292a-kube-api-access-cfr2r" (OuterVolumeSpecName: "kube-api-access-cfr2r") pod "a3d56617-f941-499c-b302-a93826be292a" (UID: "a3d56617-f941-499c-b302-a93826be292a"). InnerVolumeSpecName "kube-api-access-cfr2r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:39:49.321328 master-0 kubenswrapper[13984]: I0312 12:39:49.321172 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-util" (OuterVolumeSpecName: "util") pod "a3d56617-f941-499c-b302-a93826be292a" (UID: "a3d56617-f941-499c-b302-a93826be292a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:39:49.400307 master-0 kubenswrapper[13984]: I0312 12:39:49.400212 13984 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:39:49.400307 master-0 kubenswrapper[13984]: I0312 12:39:49.400286 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfr2r\" (UniqueName: \"kubernetes.io/projected/a3d56617-f941-499c-b302-a93826be292a-kube-api-access-cfr2r\") on node \"master-0\" DevicePath \"\""
Mar 12 12:39:49.400307 master-0 kubenswrapper[13984]: I0312 12:39:49.400299 13984 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a3d56617-f941-499c-b302-a93826be292a-util\") on node \"master-0\" DevicePath \"\""
Mar 12 12:39:49.896896 master-0 kubenswrapper[13984]: I0312 12:39:49.896773 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28" event={"ID":"a3d56617-f941-499c-b302-a93826be292a","Type":"ContainerDied","Data":"9ea6f55463099927aece89384818f4b1f3d8232e5005c6ea7afddf26037bd867"}
Mar 12 12:39:49.896896 master-0 kubenswrapper[13984]: I0312 12:39:49.896838 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ea6f55463099927aece89384818f4b1f3d8232e5005c6ea7afddf26037bd867"
Mar 12 12:39:49.896896 master-0 kubenswrapper[13984]: I0312 12:39:49.896885 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4sqs28"
Mar 12 12:39:55.933778 master-0 kubenswrapper[13984]: I0312 12:39:55.933692 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-79bc8f9f88-drd64"]
Mar 12 12:39:55.934369 master-0 kubenswrapper[13984]: E0312 12:39:55.934106 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d56617-f941-499c-b302-a93826be292a" containerName="util"
Mar 12 12:39:55.934369 master-0 kubenswrapper[13984]: I0312 12:39:55.934144 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d56617-f941-499c-b302-a93826be292a" containerName="util"
Mar 12 12:39:55.934369 master-0 kubenswrapper[13984]: E0312 12:39:55.934162 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d56617-f941-499c-b302-a93826be292a" containerName="extract"
Mar 12 12:39:55.934369 master-0 kubenswrapper[13984]: I0312 12:39:55.934171 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d56617-f941-499c-b302-a93826be292a" containerName="extract"
Mar 12 12:39:55.934369 master-0 kubenswrapper[13984]: E0312 12:39:55.934234 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3d56617-f941-499c-b302-a93826be292a" containerName="pull"
Mar 12 12:39:55.934369 master-0 kubenswrapper[13984]: I0312 12:39:55.934252 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3d56617-f941-499c-b302-a93826be292a" containerName="pull"
Mar 12 12:39:55.934646 master-0 kubenswrapper[13984]: I0312 12:39:55.934521 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3d56617-f941-499c-b302-a93826be292a" containerName="extract"
Mar 12 12:39:55.935279 master-0 kubenswrapper[13984]: I0312 12:39:55.935252 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:55.938515 master-0 kubenswrapper[13984]: I0312 12:39:55.938455 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt"
Mar 12 12:39:55.938689 master-0 kubenswrapper[13984]: I0312 12:39:55.938585 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert"
Mar 12 12:39:55.938769 master-0 kubenswrapper[13984]: I0312 12:39:55.938710 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert"
Mar 12 12:39:55.938830 master-0 kubenswrapper[13984]: I0312 12:39:55.938745 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert"
Mar 12 12:39:55.939564 master-0 kubenswrapper[13984]: I0312 12:39:55.939535 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt"
Mar 12 12:39:55.962503 master-0 kubenswrapper[13984]: I0312 12:39:55.962428 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-79bc8f9f88-drd64"]
Mar 12 12:39:56.012780 master-0 kubenswrapper[13984]: I0312 12:39:56.012727 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-metrics-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.013526 master-0 kubenswrapper[13984]: I0312 12:39:56.013490 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/623f448a-5d67-4843-94cf-748de4fa25bb-socket-dir\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.013606 master-0 kubenswrapper[13984]: I0312 12:39:56.013574 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-webhook-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.013688 master-0 kubenswrapper[13984]: I0312 12:39:56.013664 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-apiservice-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.013802 master-0 kubenswrapper[13984]: I0312 12:39:56.013768 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxzjr\" (UniqueName: \"kubernetes.io/projected/623f448a-5d67-4843-94cf-748de4fa25bb-kube-api-access-rxzjr\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.114945 master-0 kubenswrapper[13984]: I0312 12:39:56.114875 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-apiservice-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.115142 master-0 kubenswrapper[13984]: I0312 12:39:56.114962 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxzjr\" (UniqueName: \"kubernetes.io/projected/623f448a-5d67-4843-94cf-748de4fa25bb-kube-api-access-rxzjr\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.115142 master-0 kubenswrapper[13984]: I0312 12:39:56.115000 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-metrics-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.115142 master-0 kubenswrapper[13984]: I0312 12:39:56.115083 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/623f448a-5d67-4843-94cf-748de4fa25bb-socket-dir\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.115233 master-0 kubenswrapper[13984]: I0312 12:39:56.115156 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-webhook-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.116323 master-0 kubenswrapper[13984]: I0312 12:39:56.116145 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/623f448a-5d67-4843-94cf-748de4fa25bb-socket-dir\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.118494 master-0 kubenswrapper[13984]: I0312 12:39:56.118454 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-metrics-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.118645 master-0 kubenswrapper[13984]: I0312 12:39:56.118602 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-webhook-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.118921 master-0 kubenswrapper[13984]: I0312 12:39:56.118898 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/623f448a-5d67-4843-94cf-748de4fa25bb-apiservice-cert\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.134921 master-0 kubenswrapper[13984]: I0312 12:39:56.134880 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxzjr\" (UniqueName: \"kubernetes.io/projected/623f448a-5d67-4843-94cf-748de4fa25bb-kube-api-access-rxzjr\") pod \"lvms-operator-79bc8f9f88-drd64\" (UID: \"623f448a-5d67-4843-94cf-748de4fa25bb\") " pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.257499 master-0 kubenswrapper[13984]: I0312 12:39:56.257427 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:39:56.677305 master-0 kubenswrapper[13984]: I0312 12:39:56.677056 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-79bc8f9f88-drd64"]
Mar 12 12:39:56.685265 master-0 kubenswrapper[13984]: W0312 12:39:56.685216 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod623f448a_5d67_4843_94cf_748de4fa25bb.slice/crio-df09d28d8ebeba0cbb56ecf27c5bc853ad3f9b745778c9b1df9a83531da7aafb WatchSource:0}: Error finding container df09d28d8ebeba0cbb56ecf27c5bc853ad3f9b745778c9b1df9a83531da7aafb: Status 404 returned error can't find the container with id df09d28d8ebeba0cbb56ecf27c5bc853ad3f9b745778c9b1df9a83531da7aafb
Mar 12 12:39:56.953886 master-0 kubenswrapper[13984]: I0312 12:39:56.952740 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-79bc8f9f88-drd64" event={"ID":"623f448a-5d67-4843-94cf-748de4fa25bb","Type":"ContainerStarted","Data":"df09d28d8ebeba0cbb56ecf27c5bc853ad3f9b745778c9b1df9a83531da7aafb"}
Mar 12 12:40:02.011961 master-0 kubenswrapper[13984]: I0312 12:40:02.010342 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:40:02.011961 master-0 kubenswrapper[13984]: I0312 12:40:02.010410 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-79bc8f9f88-drd64"
Mar 12 12:40:02.011961 master-0 kubenswrapper[13984]: I0312 12:40:02.010421 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-79bc8f9f88-drd64" event={"ID":"623f448a-5d67-4843-94cf-748de4fa25bb","Type":"ContainerStarted","Data":"566dc39fc6529c97a94f2f1d527a5ebfd7f0669816d2f0b9e20f1eeb51bdf4b4"}
Mar 12 12:40:02.032554 master-0 kubenswrapper[13984]: I0312 12:40:02.031201 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-79bc8f9f88-drd64" podStartSLOduration=2.185521313 podStartE2EDuration="7.031185312s" podCreationTimestamp="2026-03-12 12:39:55 +0000 UTC" firstStartedPulling="2026-03-12 12:39:56.68885957 +0000 UTC m=+928.886875062" lastFinishedPulling="2026-03-12 12:40:01.534523569 +0000 UTC m=+933.732539061" observedRunningTime="2026-03-12 12:40:02.026115689 +0000 UTC m=+934.224131201" watchObservedRunningTime="2026-03-12 12:40:02.031185312 +0000 UTC m=+934.229200804"
Mar 12 12:40:05.283902 master-0 kubenswrapper[13984]: I0312 12:40:05.283805 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"]
Mar 12 12:40:05.286673 master-0 kubenswrapper[13984]: I0312 12:40:05.286608 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.294187 master-0 kubenswrapper[13984]: I0312 12:40:05.292992 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"]
Mar 12 12:40:05.353836 master-0 kubenswrapper[13984]: I0312 12:40:05.353783 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.354040 master-0 kubenswrapper[13984]: I0312 12:40:05.353906 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.354040 master-0 kubenswrapper[13984]: I0312 12:40:05.353946 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6r8f\" (UniqueName: \"kubernetes.io/projected/52309de3-a0c6-4d87-9b41-f5c729db8af8-kube-api-access-r6r8f\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.455217 master-0 kubenswrapper[13984]: I0312 12:40:05.455144 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.455464 master-0 kubenswrapper[13984]: I0312 12:40:05.455310 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.455464 master-0 kubenswrapper[13984]: I0312 12:40:05.455340 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6r8f\" (UniqueName: \"kubernetes.io/projected/52309de3-a0c6-4d87-9b41-f5c729db8af8-kube-api-access-r6r8f\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.455755 master-0 kubenswrapper[13984]: I0312 12:40:05.455707 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.456086 master-0 kubenswrapper[13984]: I0312 12:40:05.456026 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.483579 master-0 kubenswrapper[13984]: I0312 12:40:05.482964 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6r8f\" (UniqueName: \"kubernetes.io/projected/52309de3-a0c6-4d87-9b41-f5c729db8af8-kube-api-access-r6r8f\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.603876 master-0 kubenswrapper[13984]: I0312 12:40:05.603739 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"
Mar 12 12:40:05.686182 master-0 kubenswrapper[13984]: I0312 12:40:05.686103 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"]
Mar 12 12:40:05.689327 master-0 kubenswrapper[13984]: I0312 12:40:05.687859 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.702190 master-0 kubenswrapper[13984]: I0312 12:40:05.702096 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"]
Mar 12 12:40:05.863974 master-0 kubenswrapper[13984]: I0312 12:40:05.863860 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vrwq\" (UniqueName: \"kubernetes.io/projected/b0b24956-b717-488b-88f8-7b71ceaf9c21-kube-api-access-7vrwq\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.864231 master-0 kubenswrapper[13984]: I0312 12:40:05.864215 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.864323 master-0 kubenswrapper[13984]: I0312 12:40:05.864309 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.966860 master-0 kubenswrapper[13984]: I0312 12:40:05.966264 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.966860 master-0 kubenswrapper[13984]: I0312 12:40:05.966443 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.966860 master-0 kubenswrapper[13984]: I0312 12:40:05.966627 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vrwq\" (UniqueName: \"kubernetes.io/projected/b0b24956-b717-488b-88f8-7b71ceaf9c21-kube-api-access-7vrwq\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.966860 master-0 kubenswrapper[13984]: I0312 12:40:05.966811 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.967134 master-0 kubenswrapper[13984]: I0312 12:40:05.966903 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:05.999433 master-0 kubenswrapper[13984]: I0312 12:40:05.999314 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vrwq\" (UniqueName: \"kubernetes.io/projected/b0b24956-b717-488b-88f8-7b71ceaf9c21-kube-api-access-7vrwq\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:06.022527 master-0 kubenswrapper[13984]: I0312 12:40:06.021817 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"
Mar 12 12:40:06.053301 master-0 kubenswrapper[13984]: I0312 12:40:06.053255 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv"]
Mar 12 12:40:06.057324 master-0 kubenswrapper[13984]: W0312 12:40:06.057284 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52309de3_a0c6_4d87_9b41_f5c729db8af8.slice/crio-d586409097d2c3c709464583b4116202552a48181a4612667c5391f6f4686c3d WatchSource:0}: Error finding container d586409097d2c3c709464583b4116202552a48181a4612667c5391f6f4686c3d: Status 404 returned error can't find the container with id d586409097d2c3c709464583b4116202552a48181a4612667c5391f6f4686c3d
Mar 12 12:40:06.419893 master-0 kubenswrapper[13984]: I0312 12:40:06.419632 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k"]
Mar 12 12:40:06.479456 master-0 kubenswrapper[13984]: I0312 12:40:06.479337 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv"]
Mar 12 12:40:06.504351 master-0 kubenswrapper[13984]: I0312 12:40:06.504184 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.504674 master-0 kubenswrapper[13984]: I0312 12:40:06.503334 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv"] Mar 12 12:40:06.578023 master-0 kubenswrapper[13984]: I0312 12:40:06.577972 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv9qz\" (UniqueName: \"kubernetes.io/projected/4899bb2e-d091-4aa6-82f5-002af8bca9fe-kube-api-access-qv9qz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.578203 master-0 kubenswrapper[13984]: I0312 12:40:06.578043 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.578203 master-0 kubenswrapper[13984]: I0312 12:40:06.578098 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.679619 master-0 kubenswrapper[13984]: I0312 12:40:06.679465 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.679619 master-0 kubenswrapper[13984]: I0312 12:40:06.679582 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv9qz\" (UniqueName: \"kubernetes.io/projected/4899bb2e-d091-4aa6-82f5-002af8bca9fe-kube-api-access-qv9qz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.679898 master-0 kubenswrapper[13984]: I0312 12:40:06.679628 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.679973 master-0 kubenswrapper[13984]: I0312 12:40:06.679930 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.680756 master-0 kubenswrapper[13984]: I0312 12:40:06.680017 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-bundle\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.694205 master-0 kubenswrapper[13984]: I0312 12:40:06.694155 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv9qz\" (UniqueName: \"kubernetes.io/projected/4899bb2e-d091-4aa6-82f5-002af8bca9fe-kube-api-access-qv9qz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:06.842140 master-0 kubenswrapper[13984]: I0312 12:40:06.842062 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:07.042820 master-0 kubenswrapper[13984]: I0312 12:40:07.042768 13984 generic.go:334] "Generic (PLEG): container finished" podID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerID="35d624ba79251b38d11af49b83cc4f2b983fbb8126fe976ce2527622c670dfd8" exitCode=0 Mar 12 12:40:07.043026 master-0 kubenswrapper[13984]: I0312 12:40:07.042856 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" event={"ID":"b0b24956-b717-488b-88f8-7b71ceaf9c21","Type":"ContainerDied","Data":"35d624ba79251b38d11af49b83cc4f2b983fbb8126fe976ce2527622c670dfd8"} Mar 12 12:40:07.043026 master-0 kubenswrapper[13984]: I0312 12:40:07.042948 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" event={"ID":"b0b24956-b717-488b-88f8-7b71ceaf9c21","Type":"ContainerStarted","Data":"9029a3037debad09c77bf9f078049aa342f89ef7a706a2553b5044a36a65b187"} Mar 12 12:40:07.044771 master-0 
kubenswrapper[13984]: I0312 12:40:07.044742 13984 generic.go:334] "Generic (PLEG): container finished" podID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerID="17dc7cc75484aa52ab3c6cc6ff444f63add7b490cb73dcc346c8eda1388e44ce" exitCode=0 Mar 12 12:40:07.044872 master-0 kubenswrapper[13984]: I0312 12:40:07.044801 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" event={"ID":"52309de3-a0c6-4d87-9b41-f5c729db8af8","Type":"ContainerDied","Data":"17dc7cc75484aa52ab3c6cc6ff444f63add7b490cb73dcc346c8eda1388e44ce"} Mar 12 12:40:07.044872 master-0 kubenswrapper[13984]: I0312 12:40:07.044824 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" event={"ID":"52309de3-a0c6-4d87-9b41-f5c729db8af8","Type":"ContainerStarted","Data":"d586409097d2c3c709464583b4116202552a48181a4612667c5391f6f4686c3d"} Mar 12 12:40:07.280284 master-0 kubenswrapper[13984]: I0312 12:40:07.280158 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv"] Mar 12 12:40:07.291246 master-0 kubenswrapper[13984]: W0312 12:40:07.291061 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4899bb2e_d091_4aa6_82f5_002af8bca9fe.slice/crio-9dc2c24bacc9bdc68a4feb8c0da10d5b9d72937223aa7c747f0348087eafe6bf WatchSource:0}: Error finding container 9dc2c24bacc9bdc68a4feb8c0da10d5b9d72937223aa7c747f0348087eafe6bf: Status 404 returned error can't find the container with id 9dc2c24bacc9bdc68a4feb8c0da10d5b9d72937223aa7c747f0348087eafe6bf Mar 12 12:40:08.058492 master-0 kubenswrapper[13984]: I0312 12:40:08.058405 13984 generic.go:334] "Generic (PLEG): container finished" podID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" 
containerID="2ad2fdf3a1d80b2bae79516844b96443bd06ef89e27b208a3c8382b1a59c3e5b" exitCode=0 Mar 12 12:40:08.059033 master-0 kubenswrapper[13984]: I0312 12:40:08.058526 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" event={"ID":"4899bb2e-d091-4aa6-82f5-002af8bca9fe","Type":"ContainerDied","Data":"2ad2fdf3a1d80b2bae79516844b96443bd06ef89e27b208a3c8382b1a59c3e5b"} Mar 12 12:40:08.059033 master-0 kubenswrapper[13984]: I0312 12:40:08.058581 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" event={"ID":"4899bb2e-d091-4aa6-82f5-002af8bca9fe","Type":"ContainerStarted","Data":"9dc2c24bacc9bdc68a4feb8c0da10d5b9d72937223aa7c747f0348087eafe6bf"} Mar 12 12:40:09.067342 master-0 kubenswrapper[13984]: I0312 12:40:09.067266 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" event={"ID":"52309de3-a0c6-4d87-9b41-f5c729db8af8","Type":"ContainerStarted","Data":"2d449c6b8364ed73ef7246c1eb1729a117038cfc52ca32686d862c44896553b5"} Mar 12 12:40:09.069320 master-0 kubenswrapper[13984]: I0312 12:40:09.069287 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" event={"ID":"b0b24956-b717-488b-88f8-7b71ceaf9c21","Type":"ContainerStarted","Data":"1c4eb3787b16621bd60bb9fdf9a3a1054ba1c78d21a42bf93da6ddcc718286a1"} Mar 12 12:40:10.078985 master-0 kubenswrapper[13984]: I0312 12:40:10.078930 13984 generic.go:334] "Generic (PLEG): container finished" podID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerID="1c4eb3787b16621bd60bb9fdf9a3a1054ba1c78d21a42bf93da6ddcc718286a1" exitCode=0 Mar 12 12:40:10.079564 master-0 kubenswrapper[13984]: I0312 12:40:10.079043 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" event={"ID":"b0b24956-b717-488b-88f8-7b71ceaf9c21","Type":"ContainerDied","Data":"1c4eb3787b16621bd60bb9fdf9a3a1054ba1c78d21a42bf93da6ddcc718286a1"} Mar 12 12:40:10.081009 master-0 kubenswrapper[13984]: I0312 12:40:10.080982 13984 generic.go:334] "Generic (PLEG): container finished" podID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerID="2d449c6b8364ed73ef7246c1eb1729a117038cfc52ca32686d862c44896553b5" exitCode=0 Mar 12 12:40:10.081101 master-0 kubenswrapper[13984]: I0312 12:40:10.081032 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" event={"ID":"52309de3-a0c6-4d87-9b41-f5c729db8af8","Type":"ContainerDied","Data":"2d449c6b8364ed73ef7246c1eb1729a117038cfc52ca32686d862c44896553b5"} Mar 12 12:40:11.093570 master-0 kubenswrapper[13984]: I0312 12:40:11.093292 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" event={"ID":"b0b24956-b717-488b-88f8-7b71ceaf9c21","Type":"ContainerStarted","Data":"5cf0b12850e6b245e4b48212c693fe28d62c8c490e433418ac2493a43da4724b"} Mar 12 12:40:11.096795 master-0 kubenswrapper[13984]: I0312 12:40:11.096727 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" event={"ID":"52309de3-a0c6-4d87-9b41-f5c729db8af8","Type":"ContainerStarted","Data":"971ce147ebce6691a09006cb22969e875cda698afbf5fe0cd5ae82bd7c69fbb7"} Mar 12 12:40:11.352602 master-0 kubenswrapper[13984]: I0312 12:40:11.351156 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" podStartSLOduration=4.535950894 podStartE2EDuration="6.351127787s" podCreationTimestamp="2026-03-12 12:40:05 +0000 UTC" 
firstStartedPulling="2026-03-12 12:40:07.045876758 +0000 UTC m=+939.243892250" lastFinishedPulling="2026-03-12 12:40:08.861053621 +0000 UTC m=+941.059069143" observedRunningTime="2026-03-12 12:40:11.3463478 +0000 UTC m=+943.544363312" watchObservedRunningTime="2026-03-12 12:40:11.351127787 +0000 UTC m=+943.549143269" Mar 12 12:40:11.378250 master-0 kubenswrapper[13984]: I0312 12:40:11.378150 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" podStartSLOduration=4.802541049 podStartE2EDuration="6.37812548s" podCreationTimestamp="2026-03-12 12:40:05 +0000 UTC" firstStartedPulling="2026-03-12 12:40:07.046116673 +0000 UTC m=+939.244132165" lastFinishedPulling="2026-03-12 12:40:08.621701064 +0000 UTC m=+940.819716596" observedRunningTime="2026-03-12 12:40:11.365708933 +0000 UTC m=+943.563724435" watchObservedRunningTime="2026-03-12 12:40:11.37812548 +0000 UTC m=+943.576140972" Mar 12 12:40:13.112937 master-0 kubenswrapper[13984]: I0312 12:40:13.112850 13984 generic.go:334] "Generic (PLEG): container finished" podID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerID="971ce147ebce6691a09006cb22969e875cda698afbf5fe0cd5ae82bd7c69fbb7" exitCode=0 Mar 12 12:40:13.112937 master-0 kubenswrapper[13984]: I0312 12:40:13.112919 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" event={"ID":"52309de3-a0c6-4d87-9b41-f5c729db8af8","Type":"ContainerDied","Data":"971ce147ebce6691a09006cb22969e875cda698afbf5fe0cd5ae82bd7c69fbb7"} Mar 12 12:40:13.117235 master-0 kubenswrapper[13984]: I0312 12:40:13.116995 13984 generic.go:334] "Generic (PLEG): container finished" podID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerID="e96222f7a65c20fa31286d75ec36d94a5573c2c089a6ec8e6c1c31d24d09e72c" exitCode=0 Mar 12 12:40:13.117235 master-0 kubenswrapper[13984]: I0312 12:40:13.117151 13984 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" event={"ID":"4899bb2e-d091-4aa6-82f5-002af8bca9fe","Type":"ContainerDied","Data":"e96222f7a65c20fa31286d75ec36d94a5573c2c089a6ec8e6c1c31d24d09e72c"} Mar 12 12:40:13.121744 master-0 kubenswrapper[13984]: I0312 12:40:13.121654 13984 generic.go:334] "Generic (PLEG): container finished" podID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerID="5cf0b12850e6b245e4b48212c693fe28d62c8c490e433418ac2493a43da4724b" exitCode=0 Mar 12 12:40:13.121833 master-0 kubenswrapper[13984]: I0312 12:40:13.121734 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" event={"ID":"b0b24956-b717-488b-88f8-7b71ceaf9c21","Type":"ContainerDied","Data":"5cf0b12850e6b245e4b48212c693fe28d62c8c490e433418ac2493a43da4724b"} Mar 12 12:40:14.138218 master-0 kubenswrapper[13984]: I0312 12:40:14.137122 13984 generic.go:334] "Generic (PLEG): container finished" podID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerID="e72fc72e3a5c9a4dd049c26bef1c64e144bb521c24d190b356c90371e0471159" exitCode=0 Mar 12 12:40:14.138218 master-0 kubenswrapper[13984]: I0312 12:40:14.137907 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" event={"ID":"4899bb2e-d091-4aa6-82f5-002af8bca9fe","Type":"ContainerDied","Data":"e72fc72e3a5c9a4dd049c26bef1c64e144bb521c24d190b356c90371e0471159"} Mar 12 12:40:14.620567 master-0 kubenswrapper[13984]: I0312 12:40:14.620144 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" Mar 12 12:40:14.628855 master-0 kubenswrapper[13984]: I0312 12:40:14.628803 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" Mar 12 12:40:14.644495 master-0 kubenswrapper[13984]: I0312 12:40:14.644353 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-bundle\") pod \"52309de3-a0c6-4d87-9b41-f5c729db8af8\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " Mar 12 12:40:14.644692 master-0 kubenswrapper[13984]: I0312 12:40:14.644530 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-util\") pod \"b0b24956-b717-488b-88f8-7b71ceaf9c21\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " Mar 12 12:40:14.644692 master-0 kubenswrapper[13984]: I0312 12:40:14.644558 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vrwq\" (UniqueName: \"kubernetes.io/projected/b0b24956-b717-488b-88f8-7b71ceaf9c21-kube-api-access-7vrwq\") pod \"b0b24956-b717-488b-88f8-7b71ceaf9c21\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " Mar 12 12:40:14.644692 master-0 kubenswrapper[13984]: I0312 12:40:14.644606 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-bundle\") pod \"b0b24956-b717-488b-88f8-7b71ceaf9c21\" (UID: \"b0b24956-b717-488b-88f8-7b71ceaf9c21\") " Mar 12 12:40:14.644692 master-0 kubenswrapper[13984]: I0312 12:40:14.644634 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-util\") pod \"52309de3-a0c6-4d87-9b41-f5c729db8af8\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " Mar 12 12:40:14.644692 master-0 kubenswrapper[13984]: I0312 12:40:14.644667 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6r8f\" (UniqueName: \"kubernetes.io/projected/52309de3-a0c6-4d87-9b41-f5c729db8af8-kube-api-access-r6r8f\") pod \"52309de3-a0c6-4d87-9b41-f5c729db8af8\" (UID: \"52309de3-a0c6-4d87-9b41-f5c729db8af8\") " Mar 12 12:40:14.647205 master-0 kubenswrapper[13984]: I0312 12:40:14.647168 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-bundle" (OuterVolumeSpecName: "bundle") pod "52309de3-a0c6-4d87-9b41-f5c729db8af8" (UID: "52309de3-a0c6-4d87-9b41-f5c729db8af8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:14.647683 master-0 kubenswrapper[13984]: I0312 12:40:14.647601 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-bundle" (OuterVolumeSpecName: "bundle") pod "b0b24956-b717-488b-88f8-7b71ceaf9c21" (UID: "b0b24956-b717-488b-88f8-7b71ceaf9c21"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:14.648126 master-0 kubenswrapper[13984]: I0312 12:40:14.648043 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52309de3-a0c6-4d87-9b41-f5c729db8af8-kube-api-access-r6r8f" (OuterVolumeSpecName: "kube-api-access-r6r8f") pod "52309de3-a0c6-4d87-9b41-f5c729db8af8" (UID: "52309de3-a0c6-4d87-9b41-f5c729db8af8"). InnerVolumeSpecName "kube-api-access-r6r8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:40:14.650309 master-0 kubenswrapper[13984]: I0312 12:40:14.650279 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0b24956-b717-488b-88f8-7b71ceaf9c21-kube-api-access-7vrwq" (OuterVolumeSpecName: "kube-api-access-7vrwq") pod "b0b24956-b717-488b-88f8-7b71ceaf9c21" (UID: "b0b24956-b717-488b-88f8-7b71ceaf9c21"). InnerVolumeSpecName "kube-api-access-7vrwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:40:14.659742 master-0 kubenswrapper[13984]: I0312 12:40:14.658896 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-util" (OuterVolumeSpecName: "util") pod "b0b24956-b717-488b-88f8-7b71ceaf9c21" (UID: "b0b24956-b717-488b-88f8-7b71ceaf9c21"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:14.663969 master-0 kubenswrapper[13984]: I0312 12:40:14.663607 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-util" (OuterVolumeSpecName: "util") pod "52309de3-a0c6-4d87-9b41-f5c729db8af8" (UID: "52309de3-a0c6-4d87-9b41-f5c729db8af8"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:14.745882 master-0 kubenswrapper[13984]: I0312 12:40:14.745812 13984 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-util\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:14.745882 master-0 kubenswrapper[13984]: I0312 12:40:14.745870 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6r8f\" (UniqueName: \"kubernetes.io/projected/52309de3-a0c6-4d87-9b41-f5c729db8af8-kube-api-access-r6r8f\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:14.745882 master-0 kubenswrapper[13984]: I0312 12:40:14.745882 13984 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/52309de3-a0c6-4d87-9b41-f5c729db8af8-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:14.745882 master-0 kubenswrapper[13984]: I0312 12:40:14.745891 13984 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-util\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:14.746168 master-0 kubenswrapper[13984]: I0312 12:40:14.745915 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vrwq\" (UniqueName: \"kubernetes.io/projected/b0b24956-b717-488b-88f8-7b71ceaf9c21-kube-api-access-7vrwq\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:14.746168 master-0 kubenswrapper[13984]: I0312 12:40:14.745924 13984 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b0b24956-b717-488b-88f8-7b71ceaf9c21-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:15.107421 master-0 kubenswrapper[13984]: I0312 12:40:15.107318 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6"] Mar 12 12:40:15.107797 master-0 
kubenswrapper[13984]: E0312 12:40:15.107758 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerName="extract" Mar 12 12:40:15.107797 master-0 kubenswrapper[13984]: I0312 12:40:15.107786 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerName="extract" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: E0312 12:40:15.107824 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerName="extract" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: I0312 12:40:15.107837 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerName="extract" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: E0312 12:40:15.107852 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerName="pull" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: I0312 12:40:15.107863 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerName="pull" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: E0312 12:40:15.107891 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerName="pull" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: I0312 12:40:15.107902 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerName="pull" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: E0312 12:40:15.107915 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerName="util" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: I0312 12:40:15.107926 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerName="util" Mar 
12 12:40:15.107966 master-0 kubenswrapper[13984]: E0312 12:40:15.107937 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerName="util" Mar 12 12:40:15.107966 master-0 kubenswrapper[13984]: I0312 12:40:15.107948 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerName="util" Mar 12 12:40:15.108592 master-0 kubenswrapper[13984]: I0312 12:40:15.108187 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="52309de3-a0c6-4d87-9b41-f5c729db8af8" containerName="extract" Mar 12 12:40:15.108592 master-0 kubenswrapper[13984]: I0312 12:40:15.108254 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0b24956-b717-488b-88f8-7b71ceaf9c21" containerName="extract" Mar 12 12:40:15.109847 master-0 kubenswrapper[13984]: I0312 12:40:15.109807 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.140859 master-0 kubenswrapper[13984]: I0312 12:40:15.140123 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6"] Mar 12 12:40:15.148456 master-0 kubenswrapper[13984]: I0312 12:40:15.148344 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" event={"ID":"52309de3-a0c6-4d87-9b41-f5c729db8af8","Type":"ContainerDied","Data":"d586409097d2c3c709464583b4116202552a48181a4612667c5391f6f4686c3d"} Mar 12 12:40:15.148456 master-0 kubenswrapper[13984]: I0312 12:40:15.148392 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d586409097d2c3c709464583b4116202552a48181a4612667c5391f6f4686c3d" Mar 12 12:40:15.148820 master-0 kubenswrapper[13984]: I0312 12:40:15.148557 13984 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874kjwzv" Mar 12 12:40:15.152069 master-0 kubenswrapper[13984]: I0312 12:40:15.152022 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" Mar 12 12:40:15.155805 master-0 kubenswrapper[13984]: I0312 12:40:15.155754 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1bbz9k" event={"ID":"b0b24956-b717-488b-88f8-7b71ceaf9c21","Type":"ContainerDied","Data":"9029a3037debad09c77bf9f078049aa342f89ef7a706a2553b5044a36a65b187"} Mar 12 12:40:15.155883 master-0 kubenswrapper[13984]: I0312 12:40:15.155807 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9029a3037debad09c77bf9f078049aa342f89ef7a706a2553b5044a36a65b187" Mar 12 12:40:15.155996 master-0 kubenswrapper[13984]: I0312 12:40:15.155907 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.156656 master-0 kubenswrapper[13984]: I0312 12:40:15.156587 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.156732 
master-0 kubenswrapper[13984]: I0312 12:40:15.156704 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk6zp\" (UniqueName: \"kubernetes.io/projected/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-kube-api-access-hk6zp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.258856 master-0 kubenswrapper[13984]: I0312 12:40:15.258273 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.258856 master-0 kubenswrapper[13984]: I0312 12:40:15.258366 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.258856 master-0 kubenswrapper[13984]: I0312 12:40:15.258393 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk6zp\" (UniqueName: \"kubernetes.io/projected/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-kube-api-access-hk6zp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.259131 master-0 kubenswrapper[13984]: I0312 
12:40:15.258866 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.261499 master-0 kubenswrapper[13984]: I0312 12:40:15.260234 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.279749 master-0 kubenswrapper[13984]: I0312 12:40:15.279693 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk6zp\" (UniqueName: \"kubernetes.io/projected/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-kube-api-access-hk6zp\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.433501 master-0 kubenswrapper[13984]: I0312 12:40:15.433276 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:15.438286 master-0 kubenswrapper[13984]: I0312 12:40:15.438241 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:15.463037 master-0 kubenswrapper[13984]: I0312 12:40:15.461050 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv9qz\" (UniqueName: \"kubernetes.io/projected/4899bb2e-d091-4aa6-82f5-002af8bca9fe-kube-api-access-qv9qz\") pod \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " Mar 12 12:40:15.463037 master-0 kubenswrapper[13984]: I0312 12:40:15.461127 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-util\") pod \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " Mar 12 12:40:15.463037 master-0 kubenswrapper[13984]: I0312 12:40:15.461308 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-bundle\") pod \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\" (UID: \"4899bb2e-d091-4aa6-82f5-002af8bca9fe\") " Mar 12 12:40:15.463037 master-0 kubenswrapper[13984]: I0312 12:40:15.462955 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-bundle" (OuterVolumeSpecName: "bundle") pod "4899bb2e-d091-4aa6-82f5-002af8bca9fe" (UID: "4899bb2e-d091-4aa6-82f5-002af8bca9fe"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:15.468748 master-0 kubenswrapper[13984]: I0312 12:40:15.468696 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4899bb2e-d091-4aa6-82f5-002af8bca9fe-kube-api-access-qv9qz" (OuterVolumeSpecName: "kube-api-access-qv9qz") pod "4899bb2e-d091-4aa6-82f5-002af8bca9fe" (UID: "4899bb2e-d091-4aa6-82f5-002af8bca9fe"). InnerVolumeSpecName "kube-api-access-qv9qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:40:15.478639 master-0 kubenswrapper[13984]: I0312 12:40:15.478574 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-util" (OuterVolumeSpecName: "util") pod "4899bb2e-d091-4aa6-82f5-002af8bca9fe" (UID: "4899bb2e-d091-4aa6-82f5-002af8bca9fe"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:15.562984 master-0 kubenswrapper[13984]: I0312 12:40:15.562866 13984 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:15.562984 master-0 kubenswrapper[13984]: I0312 12:40:15.562925 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv9qz\" (UniqueName: \"kubernetes.io/projected/4899bb2e-d091-4aa6-82f5-002af8bca9fe-kube-api-access-qv9qz\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:15.562984 master-0 kubenswrapper[13984]: I0312 12:40:15.562938 13984 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4899bb2e-d091-4aa6-82f5-002af8bca9fe-util\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:15.875343 master-0 kubenswrapper[13984]: I0312 12:40:15.875288 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6"] Mar 12 12:40:16.163319 master-0 kubenswrapper[13984]: I0312 12:40:16.163175 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" event={"ID":"4899bb2e-d091-4aa6-82f5-002af8bca9fe","Type":"ContainerDied","Data":"9dc2c24bacc9bdc68a4feb8c0da10d5b9d72937223aa7c747f0348087eafe6bf"} Mar 12 12:40:16.163319 master-0 kubenswrapper[13984]: I0312 12:40:16.163228 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dc2c24bacc9bdc68a4feb8c0da10d5b9d72937223aa7c747f0348087eafe6bf" Mar 12 12:40:16.163319 master-0 kubenswrapper[13984]: I0312 12:40:16.163293 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e59q7jv" Mar 12 12:40:16.165933 master-0 kubenswrapper[13984]: I0312 12:40:16.165900 13984 generic.go:334] "Generic (PLEG): container finished" podID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerID="4fb54910ec9bcf2dff8d86fb8a4a0e81a003b5832ebdc832768a10e945ebad10" exitCode=0 Mar 12 12:40:16.166164 master-0 kubenswrapper[13984]: I0312 12:40:16.166014 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" event={"ID":"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d","Type":"ContainerDied","Data":"4fb54910ec9bcf2dff8d86fb8a4a0e81a003b5832ebdc832768a10e945ebad10"} Mar 12 12:40:16.166250 master-0 kubenswrapper[13984]: I0312 12:40:16.166187 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" event={"ID":"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d","Type":"ContainerStarted","Data":"a8a6005fc9086610e2be701cb66be23359303853d05d2ac964030287b5deb98e"} Mar 12 12:40:17.901091 
master-0 kubenswrapper[13984]: I0312 12:40:17.901039 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5"] Mar 12 12:40:17.901652 master-0 kubenswrapper[13984]: E0312 12:40:17.901350 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerName="pull" Mar 12 12:40:17.901652 master-0 kubenswrapper[13984]: I0312 12:40:17.901366 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerName="pull" Mar 12 12:40:17.901652 master-0 kubenswrapper[13984]: E0312 12:40:17.901378 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerName="extract" Mar 12 12:40:17.901652 master-0 kubenswrapper[13984]: I0312 12:40:17.901385 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerName="extract" Mar 12 12:40:17.901652 master-0 kubenswrapper[13984]: E0312 12:40:17.901405 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerName="util" Mar 12 12:40:17.901652 master-0 kubenswrapper[13984]: I0312 12:40:17.901415 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerName="util" Mar 12 12:40:17.901652 master-0 kubenswrapper[13984]: I0312 12:40:17.901621 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="4899bb2e-d091-4aa6-82f5-002af8bca9fe" containerName="extract" Mar 12 12:40:17.902219 master-0 kubenswrapper[13984]: I0312 12:40:17.902200 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" Mar 12 12:40:17.904342 master-0 kubenswrapper[13984]: I0312 12:40:17.904300 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 12 12:40:17.904526 master-0 kubenswrapper[13984]: I0312 12:40:17.904430 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 12 12:40:17.915407 master-0 kubenswrapper[13984]: I0312 12:40:17.915360 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5"] Mar 12 12:40:18.099563 master-0 kubenswrapper[13984]: I0312 12:40:18.098765 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf5q6\" (UniqueName: \"kubernetes.io/projected/2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf-kube-api-access-zf5q6\") pod \"nmstate-operator-796d4cfff4-l6mg5\" (UID: \"2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" Mar 12 12:40:18.181996 master-0 kubenswrapper[13984]: I0312 12:40:18.181940 13984 generic.go:334] "Generic (PLEG): container finished" podID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerID="494b102fa4cd57ac0877239c19223ad8e91220f247038d945c39686a28137ff6" exitCode=0 Mar 12 12:40:18.181996 master-0 kubenswrapper[13984]: I0312 12:40:18.181982 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" event={"ID":"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d","Type":"ContainerDied","Data":"494b102fa4cd57ac0877239c19223ad8e91220f247038d945c39686a28137ff6"} Mar 12 12:40:18.200401 master-0 kubenswrapper[13984]: I0312 12:40:18.200343 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf5q6\" (UniqueName: 
\"kubernetes.io/projected/2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf-kube-api-access-zf5q6\") pod \"nmstate-operator-796d4cfff4-l6mg5\" (UID: \"2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" Mar 12 12:40:18.218521 master-0 kubenswrapper[13984]: I0312 12:40:18.218451 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf5q6\" (UniqueName: \"kubernetes.io/projected/2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf-kube-api-access-zf5q6\") pod \"nmstate-operator-796d4cfff4-l6mg5\" (UID: \"2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" Mar 12 12:40:18.219116 master-0 kubenswrapper[13984]: I0312 12:40:18.219088 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" Mar 12 12:40:18.654037 master-0 kubenswrapper[13984]: I0312 12:40:18.653919 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5"] Mar 12 12:40:18.723612 master-0 kubenswrapper[13984]: W0312 12:40:18.723555 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c044b22_cc0c_4ec1_b0f6_a6f34016dfbf.slice/crio-06c5ae10d751d6c5361ba8918ad0ccc30e9c07906dc016767c8d5771f81f4959 WatchSource:0}: Error finding container 06c5ae10d751d6c5361ba8918ad0ccc30e9c07906dc016767c8d5771f81f4959: Status 404 returned error can't find the container with id 06c5ae10d751d6c5361ba8918ad0ccc30e9c07906dc016767c8d5771f81f4959 Mar 12 12:40:19.189614 master-0 kubenswrapper[13984]: I0312 12:40:19.189511 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" event={"ID":"2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf","Type":"ContainerStarted","Data":"06c5ae10d751d6c5361ba8918ad0ccc30e9c07906dc016767c8d5771f81f4959"} Mar 12 12:40:19.192659 master-0 
kubenswrapper[13984]: I0312 12:40:19.192628 13984 generic.go:334] "Generic (PLEG): container finished" podID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerID="568d6e8b91e20695311983ecff06a36a5f240d55002bd5650c8e444308a3c994" exitCode=0 Mar 12 12:40:19.192778 master-0 kubenswrapper[13984]: I0312 12:40:19.192709 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" event={"ID":"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d","Type":"ContainerDied","Data":"568d6e8b91e20695311983ecff06a36a5f240d55002bd5650c8e444308a3c994"} Mar 12 12:40:20.518121 master-0 kubenswrapper[13984]: I0312 12:40:20.518062 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:20.556092 master-0 kubenswrapper[13984]: I0312 12:40:20.556046 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-bundle\") pod \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " Mar 12 12:40:20.556306 master-0 kubenswrapper[13984]: I0312 12:40:20.556115 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-util\") pod \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " Mar 12 12:40:20.556306 master-0 kubenswrapper[13984]: I0312 12:40:20.556235 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk6zp\" (UniqueName: \"kubernetes.io/projected/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-kube-api-access-hk6zp\") pod \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\" (UID: \"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d\") " Mar 12 12:40:20.560147 master-0 kubenswrapper[13984]: 
I0312 12:40:20.560105 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-kube-api-access-hk6zp" (OuterVolumeSpecName: "kube-api-access-hk6zp") pod "e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" (UID: "e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d"). InnerVolumeSpecName "kube-api-access-hk6zp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:40:20.570569 master-0 kubenswrapper[13984]: I0312 12:40:20.568403 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-bundle" (OuterVolumeSpecName: "bundle") pod "e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" (UID: "e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:20.596744 master-0 kubenswrapper[13984]: I0312 12:40:20.596594 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-util" (OuterVolumeSpecName: "util") pod "e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" (UID: "e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:40:20.658431 master-0 kubenswrapper[13984]: I0312 12:40:20.658359 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk6zp\" (UniqueName: \"kubernetes.io/projected/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-kube-api-access-hk6zp\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:20.658431 master-0 kubenswrapper[13984]: I0312 12:40:20.658409 13984 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:20.658431 master-0 kubenswrapper[13984]: I0312 12:40:20.658419 13984 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d-util\") on node \"master-0\" DevicePath \"\"" Mar 12 12:40:21.218440 master-0 kubenswrapper[13984]: I0312 12:40:21.218385 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" event={"ID":"e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d","Type":"ContainerDied","Data":"a8a6005fc9086610e2be701cb66be23359303853d05d2ac964030287b5deb98e"} Mar 12 12:40:21.218440 master-0 kubenswrapper[13984]: I0312 12:40:21.218437 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8a6005fc9086610e2be701cb66be23359303853d05d2ac964030287b5deb98e" Mar 12 12:40:21.218712 master-0 kubenswrapper[13984]: I0312 12:40:21.218467 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f085x7d6" Mar 12 12:40:22.228160 master-0 kubenswrapper[13984]: I0312 12:40:22.228109 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" event={"ID":"2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf","Type":"ContainerStarted","Data":"0c97afbb579a0e6c56f5606f73ec01b341271cf4f784938eeba3f942d4493415"} Mar 12 12:40:22.257569 master-0 kubenswrapper[13984]: I0312 12:40:22.257453 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-l6mg5" podStartSLOduration=2.827144573 podStartE2EDuration="5.257425117s" podCreationTimestamp="2026-03-12 12:40:17 +0000 UTC" firstStartedPulling="2026-03-12 12:40:18.736676358 +0000 UTC m=+950.934691850" lastFinishedPulling="2026-03-12 12:40:21.166956892 +0000 UTC m=+953.364972394" observedRunningTime="2026-03-12 12:40:22.253218386 +0000 UTC m=+954.451233888" watchObservedRunningTime="2026-03-12 12:40:22.257425117 +0000 UTC m=+954.455440629" Mar 12 12:40:28.805581 master-0 kubenswrapper[13984]: I0312 12:40:28.802734 13984 scope.go:117] "RemoveContainer" containerID="bcd65b1e4c21449072ed272f87ee781da0da9fc877eb7847799dae3f064bf2e7" Mar 12 12:40:29.809587 master-0 kubenswrapper[13984]: I0312 12:40:29.809518 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9"] Mar 12 12:40:29.810167 master-0 kubenswrapper[13984]: E0312 12:40:29.809809 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerName="util" Mar 12 12:40:29.810167 master-0 kubenswrapper[13984]: I0312 12:40:29.809824 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerName="util" Mar 12 12:40:29.810167 master-0 kubenswrapper[13984]: E0312 12:40:29.809847 
13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerName="pull" Mar 12 12:40:29.810167 master-0 kubenswrapper[13984]: I0312 12:40:29.809852 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerName="pull" Mar 12 12:40:29.810167 master-0 kubenswrapper[13984]: E0312 12:40:29.809878 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerName="extract" Mar 12 12:40:29.810167 master-0 kubenswrapper[13984]: I0312 12:40:29.809884 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerName="extract" Mar 12 12:40:29.810167 master-0 kubenswrapper[13984]: I0312 12:40:29.810052 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8a1ea4d-5c97-4eec-bfc5-91fffdb4eb3d" containerName="extract" Mar 12 12:40:29.810630 master-0 kubenswrapper[13984]: I0312 12:40:29.810596 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:29.812967 master-0 kubenswrapper[13984]: I0312 12:40:29.812889 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 12 12:40:29.813200 master-0 kubenswrapper[13984]: I0312 12:40:29.813177 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 12 12:40:29.815839 master-0 kubenswrapper[13984]: I0312 12:40:29.815809 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdzf\" (UniqueName: \"kubernetes.io/projected/42186359-5baa-4625-a081-f226be1932e0-kube-api-access-wrdzf\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sww9\" (UID: \"42186359-5baa-4625-a081-f226be1932e0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:29.815946 master-0 kubenswrapper[13984]: I0312 12:40:29.815918 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42186359-5baa-4625-a081-f226be1932e0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sww9\" (UID: \"42186359-5baa-4625-a081-f226be1932e0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:29.831179 master-0 kubenswrapper[13984]: I0312 12:40:29.831128 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9"] Mar 12 12:40:29.917555 master-0 kubenswrapper[13984]: I0312 12:40:29.917316 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42186359-5baa-4625-a081-f226be1932e0-tmp\") pod 
\"cert-manager-operator-controller-manager-66c8bdd694-5sww9\" (UID: \"42186359-5baa-4625-a081-f226be1932e0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:29.917555 master-0 kubenswrapper[13984]: I0312 12:40:29.917397 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdzf\" (UniqueName: \"kubernetes.io/projected/42186359-5baa-4625-a081-f226be1932e0-kube-api-access-wrdzf\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sww9\" (UID: \"42186359-5baa-4625-a081-f226be1932e0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:29.918232 master-0 kubenswrapper[13984]: I0312 12:40:29.918171 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/42186359-5baa-4625-a081-f226be1932e0-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sww9\" (UID: \"42186359-5baa-4625-a081-f226be1932e0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:29.942633 master-0 kubenswrapper[13984]: I0312 12:40:29.941948 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdzf\" (UniqueName: \"kubernetes.io/projected/42186359-5baa-4625-a081-f226be1932e0-kube-api-access-wrdzf\") pod \"cert-manager-operator-controller-manager-66c8bdd694-5sww9\" (UID: \"42186359-5baa-4625-a081-f226be1932e0\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:30.128526 master-0 kubenswrapper[13984]: I0312 12:40:30.128334 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" Mar 12 12:40:30.626289 master-0 kubenswrapper[13984]: I0312 12:40:30.626216 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9"] Mar 12 12:40:30.628700 master-0 kubenswrapper[13984]: W0312 12:40:30.628639 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42186359_5baa_4625_a081_f226be1932e0.slice/crio-16fd6ad0b5a63d220c3b6fecd525382d88d0f8d105b6bc3150929b9b12a5ad08 WatchSource:0}: Error finding container 16fd6ad0b5a63d220c3b6fecd525382d88d0f8d105b6bc3150929b9b12a5ad08: Status 404 returned error can't find the container with id 16fd6ad0b5a63d220c3b6fecd525382d88d0f8d105b6bc3150929b9b12a5ad08 Mar 12 12:40:31.291938 master-0 kubenswrapper[13984]: I0312 12:40:31.291872 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" event={"ID":"42186359-5baa-4625-a081-f226be1932e0","Type":"ContainerStarted","Data":"16fd6ad0b5a63d220c3b6fecd525382d88d0f8d105b6bc3150929b9b12a5ad08"} Mar 12 12:40:35.342013 master-0 kubenswrapper[13984]: I0312 12:40:35.341933 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" event={"ID":"42186359-5baa-4625-a081-f226be1932e0","Type":"ContainerStarted","Data":"5c228a89361eadbe184118a712d0b4696b5348fe1f3bdd5c72ae5d3358554dbd"} Mar 12 12:40:35.366776 master-0 kubenswrapper[13984]: I0312 12:40:35.366669 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-5sww9" podStartSLOduration=2.10248491 podStartE2EDuration="6.366650689s" podCreationTimestamp="2026-03-12 12:40:29 +0000 UTC" 
firstStartedPulling="2026-03-12 12:40:30.632773191 +0000 UTC m=+962.830788683" lastFinishedPulling="2026-03-12 12:40:34.89693897 +0000 UTC m=+967.094954462" observedRunningTime="2026-03-12 12:40:35.365740543 +0000 UTC m=+967.563756045" watchObservedRunningTime="2026-03-12 12:40:35.366650689 +0000 UTC m=+967.564666181" Mar 12 12:40:35.527083 master-0 kubenswrapper[13984]: I0312 12:40:35.526992 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"] Mar 12 12:40:35.528139 master-0 kubenswrapper[13984]: I0312 12:40:35.528102 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5" Mar 12 12:40:35.532245 master-0 kubenswrapper[13984]: I0312 12:40:35.532206 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 12 12:40:35.532465 master-0 kubenswrapper[13984]: I0312 12:40:35.532442 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 12 12:40:35.532623 master-0 kubenswrapper[13984]: I0312 12:40:35.532602 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 12 12:40:35.533600 master-0 kubenswrapper[13984]: I0312 12:40:35.533578 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 12 12:40:35.558051 master-0 kubenswrapper[13984]: I0312 12:40:35.557983 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"] Mar 12 12:40:35.616130 master-0 kubenswrapper[13984]: I0312 12:40:35.615930 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a16471dc-b7aa-43ed-876d-895680bd9539-apiservice-cert\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.616130 master-0 kubenswrapper[13984]: I0312 12:40:35.616024 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w2qm\" (UniqueName: \"kubernetes.io/projected/a16471dc-b7aa-43ed-876d-895680bd9539-kube-api-access-9w2qm\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.616130 master-0 kubenswrapper[13984]: I0312 12:40:35.616080 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a16471dc-b7aa-43ed-876d-895680bd9539-webhook-cert\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.723661 master-0 kubenswrapper[13984]: I0312 12:40:35.722224 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w2qm\" (UniqueName: \"kubernetes.io/projected/a16471dc-b7aa-43ed-876d-895680bd9539-kube-api-access-9w2qm\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.723661 master-0 kubenswrapper[13984]: I0312 12:40:35.722319 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a16471dc-b7aa-43ed-876d-895680bd9539-webhook-cert\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.723661 master-0 kubenswrapper[13984]: I0312 12:40:35.722358 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a16471dc-b7aa-43ed-876d-895680bd9539-apiservice-cert\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.726302 master-0 kubenswrapper[13984]: I0312 12:40:35.726257 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a16471dc-b7aa-43ed-876d-895680bd9539-webhook-cert\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.745626 master-0 kubenswrapper[13984]: I0312 12:40:35.745089 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a16471dc-b7aa-43ed-876d-895680bd9539-apiservice-cert\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.745626 master-0 kubenswrapper[13984]: I0312 12:40:35.745163 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w2qm\" (UniqueName: \"kubernetes.io/projected/a16471dc-b7aa-43ed-876d-895680bd9539-kube-api-access-9w2qm\") pod \"metallb-operator-controller-manager-787f7977d6-w89p5\" (UID: \"a16471dc-b7aa-43ed-876d-895680bd9539\") " pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.843448 master-0 kubenswrapper[13984]: I0312 12:40:35.843383 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:35.846981 master-0 kubenswrapper[13984]: I0312 12:40:35.846921 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"]
Mar 12 12:40:35.847877 master-0 kubenswrapper[13984]: I0312 12:40:35.847846 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:35.854563 master-0 kubenswrapper[13984]: I0312 12:40:35.851910 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Mar 12 12:40:35.854563 master-0 kubenswrapper[13984]: I0312 12:40:35.852758 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Mar 12 12:40:35.870314 master-0 kubenswrapper[13984]: I0312 12:40:35.870165 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"]
Mar 12 12:40:35.925677 master-0 kubenswrapper[13984]: I0312 12:40:35.925620 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-webhook-cert\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:35.925906 master-0 kubenswrapper[13984]: I0312 12:40:35.925741 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-apiservice-cert\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:35.925906 master-0 kubenswrapper[13984]: I0312 12:40:35.925795 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xf4\" (UniqueName: \"kubernetes.io/projected/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-kube-api-access-p6xf4\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.038505 master-0 kubenswrapper[13984]: I0312 12:40:36.038430 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-webhook-cert\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.038746 master-0 kubenswrapper[13984]: I0312 12:40:36.038564 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-apiservice-cert\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.038746 master-0 kubenswrapper[13984]: I0312 12:40:36.038621 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xf4\" (UniqueName: \"kubernetes.io/projected/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-kube-api-access-p6xf4\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.049502 master-0 kubenswrapper[13984]: I0312 12:40:36.044415 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-webhook-cert\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.057529 master-0 kubenswrapper[13984]: I0312 12:40:36.049826 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-apiservice-cert\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.057529 master-0 kubenswrapper[13984]: I0312 12:40:36.056793 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xf4\" (UniqueName: \"kubernetes.io/projected/c0d7dfa9-7424-491d-b0a0-3543e41efe0b-kube-api-access-p6xf4\") pod \"metallb-operator-webhook-server-56d7584dd9-8sxrm\" (UID: \"c0d7dfa9-7424-491d-b0a0-3543e41efe0b\") " pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.196554 master-0 kubenswrapper[13984]: I0312 12:40:36.196110 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:36.342197 master-0 kubenswrapper[13984]: I0312 12:40:36.340889 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"]
Mar 12 12:40:36.712262 master-0 kubenswrapper[13984]: I0312 12:40:36.712137 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"]
Mar 12 12:40:37.438306 master-0 kubenswrapper[13984]: I0312 12:40:37.438231 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5" event={"ID":"a16471dc-b7aa-43ed-876d-895680bd9539","Type":"ContainerStarted","Data":"be1739f96ae3c733e829f78ae23ebec630d2689d67fce842b3e6db42773bba8f"}
Mar 12 12:40:37.456620 master-0 kubenswrapper[13984]: I0312 12:40:37.455830 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm" event={"ID":"c0d7dfa9-7424-491d-b0a0-3543e41efe0b","Type":"ContainerStarted","Data":"7c77f138a29a0fe9cff4d8b6cc230bf10a9be7f7fbe28e6181086890a3c75a7d"}
Mar 12 12:40:39.027501 master-0 kubenswrapper[13984]: I0312 12:40:39.026977 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wnrrg"]
Mar 12 12:40:39.032072 master-0 kubenswrapper[13984]: I0312 12:40:39.029060 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:39.038504 master-0 kubenswrapper[13984]: I0312 12:40:39.034754 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 12 12:40:39.038504 master-0 kubenswrapper[13984]: I0312 12:40:39.035017 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 12 12:40:39.062306 master-0 kubenswrapper[13984]: I0312 12:40:39.059797 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wnrrg"]
Mar 12 12:40:39.142410 master-0 kubenswrapper[13984]: I0312 12:40:39.141114 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9qs4\" (UniqueName: \"kubernetes.io/projected/454666d2-7428-4205-9071-a8a8f1af8f35-kube-api-access-g9qs4\") pod \"cert-manager-webhook-6888856db4-wnrrg\" (UID: \"454666d2-7428-4205-9071-a8a8f1af8f35\") " pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:39.142410 master-0 kubenswrapper[13984]: I0312 12:40:39.141182 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/454666d2-7428-4205-9071-a8a8f1af8f35-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wnrrg\" (UID: \"454666d2-7428-4205-9071-a8a8f1af8f35\") " pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:39.243481 master-0 kubenswrapper[13984]: I0312 12:40:39.243372 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9qs4\" (UniqueName: \"kubernetes.io/projected/454666d2-7428-4205-9071-a8a8f1af8f35-kube-api-access-g9qs4\") pod \"cert-manager-webhook-6888856db4-wnrrg\" (UID: \"454666d2-7428-4205-9071-a8a8f1af8f35\") " pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:39.243709 master-0 kubenswrapper[13984]: I0312 12:40:39.243512 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/454666d2-7428-4205-9071-a8a8f1af8f35-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wnrrg\" (UID: \"454666d2-7428-4205-9071-a8a8f1af8f35\") " pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:39.296266 master-0 kubenswrapper[13984]: I0312 12:40:39.296143 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9qs4\" (UniqueName: \"kubernetes.io/projected/454666d2-7428-4205-9071-a8a8f1af8f35-kube-api-access-g9qs4\") pod \"cert-manager-webhook-6888856db4-wnrrg\" (UID: \"454666d2-7428-4205-9071-a8a8f1af8f35\") " pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:39.303298 master-0 kubenswrapper[13984]: I0312 12:40:39.303221 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/454666d2-7428-4205-9071-a8a8f1af8f35-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-wnrrg\" (UID: \"454666d2-7428-4205-9071-a8a8f1af8f35\") " pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:39.358611 master-0 kubenswrapper[13984]: I0312 12:40:39.358551 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg"
Mar 12 12:40:40.058609 master-0 kubenswrapper[13984]: I0312 12:40:40.058524 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-wnrrg"]
Mar 12 12:40:40.516112 master-0 kubenswrapper[13984]: I0312 12:40:40.516048 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg" event={"ID":"454666d2-7428-4205-9071-a8a8f1af8f35","Type":"ContainerStarted","Data":"78eea2fc45c51ce77bfe7ef7ea1084b4ad02ec4b69f1b628da2c03501ac11661"}
Mar 12 12:40:41.370825 master-0 kubenswrapper[13984]: I0312 12:40:41.364437 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-nznlb"]
Mar 12 12:40:41.370825 master-0 kubenswrapper[13984]: I0312 12:40:41.365345 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:41.391514 master-0 kubenswrapper[13984]: I0312 12:40:41.388979 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-nznlb"]
Mar 12 12:40:41.496499 master-0 kubenswrapper[13984]: I0312 12:40:41.496128 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea299a1d-4df3-4f4a-985f-35922cd878f4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-nznlb\" (UID: \"ea299a1d-4df3-4f4a-985f-35922cd878f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:41.496499 master-0 kubenswrapper[13984]: I0312 12:40:41.496196 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rvf2\" (UniqueName: \"kubernetes.io/projected/ea299a1d-4df3-4f4a-985f-35922cd878f4-kube-api-access-6rvf2\") pod \"cert-manager-cainjector-5545bd876-nznlb\" (UID: \"ea299a1d-4df3-4f4a-985f-35922cd878f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:41.599508 master-0 kubenswrapper[13984]: I0312 12:40:41.599326 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea299a1d-4df3-4f4a-985f-35922cd878f4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-nznlb\" (UID: \"ea299a1d-4df3-4f4a-985f-35922cd878f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:41.599508 master-0 kubenswrapper[13984]: I0312 12:40:41.599405 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rvf2\" (UniqueName: \"kubernetes.io/projected/ea299a1d-4df3-4f4a-985f-35922cd878f4-kube-api-access-6rvf2\") pod \"cert-manager-cainjector-5545bd876-nznlb\" (UID: \"ea299a1d-4df3-4f4a-985f-35922cd878f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:41.696564 master-0 kubenswrapper[13984]: I0312 12:40:41.690809 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rvf2\" (UniqueName: \"kubernetes.io/projected/ea299a1d-4df3-4f4a-985f-35922cd878f4-kube-api-access-6rvf2\") pod \"cert-manager-cainjector-5545bd876-nznlb\" (UID: \"ea299a1d-4df3-4f4a-985f-35922cd878f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:41.697534 master-0 kubenswrapper[13984]: I0312 12:40:41.697233 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ea299a1d-4df3-4f4a-985f-35922cd878f4-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-nznlb\" (UID: \"ea299a1d-4df3-4f4a-985f-35922cd878f4\") " pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:41.996577 master-0 kubenswrapper[13984]: I0312 12:40:41.994676 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb"
Mar 12 12:40:45.869224 master-0 kubenswrapper[13984]: I0312 12:40:45.869150 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-nznlb"]
Mar 12 12:40:48.005378 master-0 kubenswrapper[13984]: W0312 12:40:48.005336 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea299a1d_4df3_4f4a_985f_35922cd878f4.slice/crio-71879894f92828293021cadcaf080f13e0e6b46217cef2da038d2216594b74fd WatchSource:0}: Error finding container 71879894f92828293021cadcaf080f13e0e6b46217cef2da038d2216594b74fd: Status 404 returned error can't find the container with id 71879894f92828293021cadcaf080f13e0e6b46217cef2da038d2216594b74fd
Mar 12 12:40:48.632938 master-0 kubenswrapper[13984]: I0312 12:40:48.632812 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5" event={"ID":"a16471dc-b7aa-43ed-876d-895680bd9539","Type":"ContainerStarted","Data":"e370255e8958de07ad333d065b743cddb8e3e4e4a97620a49275c8de6f313fc4"}
Mar 12 12:40:48.632938 master-0 kubenswrapper[13984]: I0312 12:40:48.632883 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5"
Mar 12 12:40:48.633865 master-0 kubenswrapper[13984]: I0312 12:40:48.633832 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb" event={"ID":"ea299a1d-4df3-4f4a-985f-35922cd878f4","Type":"ContainerStarted","Data":"71879894f92828293021cadcaf080f13e0e6b46217cef2da038d2216594b74fd"}
Mar 12 12:40:48.635274 master-0 kubenswrapper[13984]: I0312 12:40:48.635235 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm" event={"ID":"c0d7dfa9-7424-491d-b0a0-3543e41efe0b","Type":"ContainerStarted","Data":"57e633923548a99375e4a1b336a5eb6596dd362516ec558885c0d5325b49d730"}
Mar 12 12:40:48.635726 master-0 kubenswrapper[13984]: I0312 12:40:48.635706 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm"
Mar 12 12:40:48.660496 master-0 kubenswrapper[13984]: I0312 12:40:48.660408 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5" podStartSLOduration=1.9890242200000001 podStartE2EDuration="13.66039267s" podCreationTimestamp="2026-03-12 12:40:35 +0000 UTC" firstStartedPulling="2026-03-12 12:40:36.407337395 +0000 UTC m=+968.605352887" lastFinishedPulling="2026-03-12 12:40:48.078705835 +0000 UTC m=+980.276721337" observedRunningTime="2026-03-12 12:40:48.652561035 +0000 UTC m=+980.850576537" watchObservedRunningTime="2026-03-12 12:40:48.66039267 +0000 UTC m=+980.858408182"
Mar 12 12:40:48.826437 master-0 kubenswrapper[13984]: I0312 12:40:48.826341 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm" podStartSLOduration=2.4436136250000002 podStartE2EDuration="13.826317845s" podCreationTimestamp="2026-03-12 12:40:35 +0000 UTC" firstStartedPulling="2026-03-12 12:40:36.725643226 +0000 UTC m=+968.923658718" lastFinishedPulling="2026-03-12 12:40:48.108347446 +0000 UTC m=+980.306362938" observedRunningTime="2026-03-12 12:40:48.679080886 +0000 UTC m=+980.877096378" watchObservedRunningTime="2026-03-12 12:40:48.826317845 +0000 UTC m=+981.024333337"
Mar 12 12:40:48.845519 master-0 kubenswrapper[13984]: I0312 12:40:48.845464 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"]
Mar 12 12:40:48.855971 master-0 kubenswrapper[13984]: I0312 12:40:48.855930 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"
Mar 12 12:40:48.862954 master-0 kubenswrapper[13984]: I0312 12:40:48.860014 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Mar 12 12:40:48.862954 master-0 kubenswrapper[13984]: I0312 12:40:48.860235 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Mar 12 12:40:48.924594 master-0 kubenswrapper[13984]: I0312 12:40:48.913399 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"]
Mar 12 12:40:48.982888 master-0 kubenswrapper[13984]: I0312 12:40:48.982840 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlvp2\" (UniqueName: \"kubernetes.io/projected/e9ba1126-67c1-4557-8a58-7931567ff140-kube-api-access-vlvp2\") pod \"obo-prometheus-operator-68bc856cb9-j54nh\" (UID: \"e9ba1126-67c1-4557-8a58-7931567ff140\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"
Mar 12 12:40:48.984143 master-0 kubenswrapper[13984]: I0312 12:40:48.984103 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"]
Mar 12 12:40:48.985063 master-0 kubenswrapper[13984]: I0312 12:40:48.984999 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:48.999811 master-0 kubenswrapper[13984]: I0312 12:40:48.999778 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Mar 12 12:40:49.013692 master-0 kubenswrapper[13984]: I0312 12:40:49.013376 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"]
Mar 12 12:40:49.015055 master-0 kubenswrapper[13984]: I0312 12:40:49.014723 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.028956 master-0 kubenswrapper[13984]: I0312 12:40:49.028879 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"]
Mar 12 12:40:49.057505 master-0 kubenswrapper[13984]: I0312 12:40:49.053100 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"]
Mar 12 12:40:49.084775 master-0 kubenswrapper[13984]: I0312 12:40:49.084720 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab2dd538-a035-4a9c-8026-604a74e84fa0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm\" (UID: \"ab2dd538-a035-4a9c-8026-604a74e84fa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.085057 master-0 kubenswrapper[13984]: I0312 12:40:49.084793 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d09166ec-2efd-484a-9177-7edd92703c33-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-cf88b\" (UID: \"d09166ec-2efd-484a-9177-7edd92703c33\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:49.085057 master-0 kubenswrapper[13984]: I0312 12:40:49.084961 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab2dd538-a035-4a9c-8026-604a74e84fa0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm\" (UID: \"ab2dd538-a035-4a9c-8026-604a74e84fa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.085312 master-0 kubenswrapper[13984]: I0312 12:40:49.085278 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlvp2\" (UniqueName: \"kubernetes.io/projected/e9ba1126-67c1-4557-8a58-7931567ff140-kube-api-access-vlvp2\") pod \"obo-prometheus-operator-68bc856cb9-j54nh\" (UID: \"e9ba1126-67c1-4557-8a58-7931567ff140\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"
Mar 12 12:40:49.085541 master-0 kubenswrapper[13984]: I0312 12:40:49.085522 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d09166ec-2efd-484a-9177-7edd92703c33-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-cf88b\" (UID: \"d09166ec-2efd-484a-9177-7edd92703c33\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:49.105944 master-0 kubenswrapper[13984]: I0312 12:40:49.105885 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlvp2\" (UniqueName: \"kubernetes.io/projected/e9ba1126-67c1-4557-8a58-7931567ff140-kube-api-access-vlvp2\") pod \"obo-prometheus-operator-68bc856cb9-j54nh\" (UID: \"e9ba1126-67c1-4557-8a58-7931567ff140\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"
Mar 12 12:40:49.187357 master-0 kubenswrapper[13984]: I0312 12:40:49.186828 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d09166ec-2efd-484a-9177-7edd92703c33-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-cf88b\" (UID: \"d09166ec-2efd-484a-9177-7edd92703c33\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:49.187357 master-0 kubenswrapper[13984]: I0312 12:40:49.186917 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab2dd538-a035-4a9c-8026-604a74e84fa0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm\" (UID: \"ab2dd538-a035-4a9c-8026-604a74e84fa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.187357 master-0 kubenswrapper[13984]: I0312 12:40:49.186970 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d09166ec-2efd-484a-9177-7edd92703c33-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-cf88b\" (UID: \"d09166ec-2efd-484a-9177-7edd92703c33\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:49.187357 master-0 kubenswrapper[13984]: I0312 12:40:49.187010 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab2dd538-a035-4a9c-8026-604a74e84fa0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm\" (UID: \"ab2dd538-a035-4a9c-8026-604a74e84fa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.191455 master-0 kubenswrapper[13984]: I0312 12:40:49.191414 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ab2dd538-a035-4a9c-8026-604a74e84fa0-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm\" (UID: \"ab2dd538-a035-4a9c-8026-604a74e84fa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.191928 master-0 kubenswrapper[13984]: I0312 12:40:49.191905 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ab2dd538-a035-4a9c-8026-604a74e84fa0-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm\" (UID: \"ab2dd538-a035-4a9c-8026-604a74e84fa0\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.201502 master-0 kubenswrapper[13984]: I0312 12:40:49.197536 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d09166ec-2efd-484a-9177-7edd92703c33-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-cf88b\" (UID: \"d09166ec-2efd-484a-9177-7edd92703c33\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:49.215298 master-0 kubenswrapper[13984]: I0312 12:40:49.215249 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d09166ec-2efd-484a-9177-7edd92703c33-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-5f7678f964-cf88b\" (UID: \"d09166ec-2efd-484a-9177-7edd92703c33\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:49.232863 master-0 kubenswrapper[13984]: I0312 12:40:49.232637 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-89rvv"]
Mar 12 12:40:49.236156 master-0 kubenswrapper[13984]: I0312 12:40:49.233678 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.237125 master-0 kubenswrapper[13984]: I0312 12:40:49.236670 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls"
Mar 12 12:40:49.250076 master-0 kubenswrapper[13984]: I0312 12:40:49.250005 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"
Mar 12 12:40:49.278427 master-0 kubenswrapper[13984]: I0312 12:40:49.278317 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-89rvv"]
Mar 12 12:40:49.288244 master-0 kubenswrapper[13984]: I0312 12:40:49.288179 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/39c8fb0a-1bc4-4d5c-a8c8-72816605835f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-89rvv\" (UID: \"39c8fb0a-1bc4-4d5c-a8c8-72816605835f\") " pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.288244 master-0 kubenswrapper[13984]: I0312 12:40:49.288250 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vt8z\" (UniqueName: \"kubernetes.io/projected/39c8fb0a-1bc4-4d5c-a8c8-72816605835f-kube-api-access-5vt8z\") pod \"observability-operator-59bdc8b94-89rvv\" (UID: \"39c8fb0a-1bc4-4d5c-a8c8-72816605835f\") " pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.364215 master-0 kubenswrapper[13984]: I0312 12:40:49.364175 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"
Mar 12 12:40:49.374072 master-0 kubenswrapper[13984]: I0312 12:40:49.373962 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"
Mar 12 12:40:49.392062 master-0 kubenswrapper[13984]: I0312 12:40:49.390637 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/39c8fb0a-1bc4-4d5c-a8c8-72816605835f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-89rvv\" (UID: \"39c8fb0a-1bc4-4d5c-a8c8-72816605835f\") " pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.392062 master-0 kubenswrapper[13984]: I0312 12:40:49.390761 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vt8z\" (UniqueName: \"kubernetes.io/projected/39c8fb0a-1bc4-4d5c-a8c8-72816605835f-kube-api-access-5vt8z\") pod \"observability-operator-59bdc8b94-89rvv\" (UID: \"39c8fb0a-1bc4-4d5c-a8c8-72816605835f\") " pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.398422 master-0 kubenswrapper[13984]: I0312 12:40:49.395001 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/39c8fb0a-1bc4-4d5c-a8c8-72816605835f-observability-operator-tls\") pod \"observability-operator-59bdc8b94-89rvv\" (UID: \"39c8fb0a-1bc4-4d5c-a8c8-72816605835f\") " pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.407509 master-0 kubenswrapper[13984]: I0312 12:40:49.405437 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-7p9wl"]
Mar 12 12:40:49.407509 master-0 kubenswrapper[13984]: I0312 12:40:49.406463 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:49.416974 master-0 kubenswrapper[13984]: I0312 12:40:49.416927 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-7p9wl"]
Mar 12 12:40:49.422182 master-0 kubenswrapper[13984]: I0312 12:40:49.422138 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vt8z\" (UniqueName: \"kubernetes.io/projected/39c8fb0a-1bc4-4d5c-a8c8-72816605835f-kube-api-access-5vt8z\") pod \"observability-operator-59bdc8b94-89rvv\" (UID: \"39c8fb0a-1bc4-4d5c-a8c8-72816605835f\") " pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.492308 master-0 kubenswrapper[13984]: I0312 12:40:49.492243 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/05ee168b-a67d-4370-86f3-f13737758499-openshift-service-ca\") pod \"perses-operator-5bf474d74f-7p9wl\" (UID: \"05ee168b-a67d-4370-86f3-f13737758499\") " pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:49.492629 master-0 kubenswrapper[13984]: I0312 12:40:49.492375 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw265\" (UniqueName: \"kubernetes.io/projected/05ee168b-a67d-4370-86f3-f13737758499-kube-api-access-tw265\") pod \"perses-operator-5bf474d74f-7p9wl\" (UID: \"05ee168b-a67d-4370-86f3-f13737758499\") " pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:49.568045 master-0 kubenswrapper[13984]: I0312 12:40:49.567998 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-89rvv"
Mar 12 12:40:49.595338 master-0 kubenswrapper[13984]: I0312 12:40:49.595197 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/05ee168b-a67d-4370-86f3-f13737758499-openshift-service-ca\") pod \"perses-operator-5bf474d74f-7p9wl\" (UID: \"05ee168b-a67d-4370-86f3-f13737758499\") " pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:49.595338 master-0 kubenswrapper[13984]: I0312 12:40:49.595306 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw265\" (UniqueName: \"kubernetes.io/projected/05ee168b-a67d-4370-86f3-f13737758499-kube-api-access-tw265\") pod \"perses-operator-5bf474d74f-7p9wl\" (UID: \"05ee168b-a67d-4370-86f3-f13737758499\") " pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:49.596820 master-0 kubenswrapper[13984]: I0312 12:40:49.596687 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/05ee168b-a67d-4370-86f3-f13737758499-openshift-service-ca\") pod \"perses-operator-5bf474d74f-7p9wl\" (UID: \"05ee168b-a67d-4370-86f3-f13737758499\") " pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:49.621130 master-0 kubenswrapper[13984]: I0312 12:40:49.621078 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw265\" (UniqueName: \"kubernetes.io/projected/05ee168b-a67d-4370-86f3-f13737758499-kube-api-access-tw265\") pod \"perses-operator-5bf474d74f-7p9wl\" (UID: \"05ee168b-a67d-4370-86f3-f13737758499\") " pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:49.816684 master-0 kubenswrapper[13984]: I0312 12:40:49.815816 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-7p9wl"
Mar 12 12:40:50.311585 master-0 kubenswrapper[13984]: I0312 12:40:50.311515 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-r75qh"]
Mar 12 12:40:50.313092 master-0 kubenswrapper[13984]: I0312 12:40:50.313061 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-r75qh"
Mar 12 12:40:50.324455 master-0 kubenswrapper[13984]: I0312 12:40:50.323590 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-r75qh"]
Mar 12 12:40:50.414268 master-0 kubenswrapper[13984]: I0312 12:40:50.414201 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nz6gd\" (UniqueName: \"kubernetes.io/projected/5140e444-2351-4e05-a390-def937eb7c93-kube-api-access-nz6gd\") pod \"cert-manager-545d4d4674-r75qh\" (UID: \"5140e444-2351-4e05-a390-def937eb7c93\") " pod="cert-manager/cert-manager-545d4d4674-r75qh"
Mar 12 12:40:50.414487 master-0 kubenswrapper[13984]: I0312 12:40:50.414360 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5140e444-2351-4e05-a390-def937eb7c93-bound-sa-token\") pod \"cert-manager-545d4d4674-r75qh\" (UID: \"5140e444-2351-4e05-a390-def937eb7c93\") " pod="cert-manager/cert-manager-545d4d4674-r75qh"
Mar 12 12:40:50.515835 master-0 kubenswrapper[13984]: I0312 12:40:50.515782 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5140e444-2351-4e05-a390-def937eb7c93-bound-sa-token\") pod \"cert-manager-545d4d4674-r75qh\" (UID: \"5140e444-2351-4e05-a390-def937eb7c93\") " pod="cert-manager/cert-manager-545d4d4674-r75qh"
Mar 12 12:40:50.516048 master-0 kubenswrapper[13984]: I0312 12:40:50.515877
13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nz6gd\" (UniqueName: \"kubernetes.io/projected/5140e444-2351-4e05-a390-def937eb7c93-kube-api-access-nz6gd\") pod \"cert-manager-545d4d4674-r75qh\" (UID: \"5140e444-2351-4e05-a390-def937eb7c93\") " pod="cert-manager/cert-manager-545d4d4674-r75qh" Mar 12 12:40:50.531966 master-0 kubenswrapper[13984]: I0312 12:40:50.531366 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nz6gd\" (UniqueName: \"kubernetes.io/projected/5140e444-2351-4e05-a390-def937eb7c93-kube-api-access-nz6gd\") pod \"cert-manager-545d4d4674-r75qh\" (UID: \"5140e444-2351-4e05-a390-def937eb7c93\") " pod="cert-manager/cert-manager-545d4d4674-r75qh" Mar 12 12:40:50.539520 master-0 kubenswrapper[13984]: I0312 12:40:50.539454 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5140e444-2351-4e05-a390-def937eb7c93-bound-sa-token\") pod \"cert-manager-545d4d4674-r75qh\" (UID: \"5140e444-2351-4e05-a390-def937eb7c93\") " pod="cert-manager/cert-manager-545d4d4674-r75qh" Mar 12 12:40:50.644636 master-0 kubenswrapper[13984]: I0312 12:40:50.643220 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-r75qh" Mar 12 12:40:51.687736 master-0 kubenswrapper[13984]: I0312 12:40:51.687644 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb" event={"ID":"ea299a1d-4df3-4f4a-985f-35922cd878f4","Type":"ContainerStarted","Data":"c167585dbcb03bb5d5c8d6368202e2ee9cd2c44399bf09171ab70ee9931d9d38"} Mar 12 12:40:51.698240 master-0 kubenswrapper[13984]: I0312 12:40:51.698171 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b"] Mar 12 12:40:51.705934 master-0 kubenswrapper[13984]: I0312 12:40:51.705868 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg" event={"ID":"454666d2-7428-4205-9071-a8a8f1af8f35","Type":"ContainerStarted","Data":"8a16ea864d6873bfe12099aa41b296e9cb69eda583862823bf171455a84320c0"} Mar 12 12:40:51.707506 master-0 kubenswrapper[13984]: I0312 12:40:51.706763 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg" Mar 12 12:40:51.710954 master-0 kubenswrapper[13984]: I0312 12:40:51.709038 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-nznlb" podStartSLOduration=7.415455086 podStartE2EDuration="10.709030281s" podCreationTimestamp="2026-03-12 12:40:41 +0000 UTC" firstStartedPulling="2026-03-12 12:40:48.065703111 +0000 UTC m=+980.263718603" lastFinishedPulling="2026-03-12 12:40:51.359278306 +0000 UTC m=+983.557293798" observedRunningTime="2026-03-12 12:40:51.708414433 +0000 UTC m=+983.906429925" watchObservedRunningTime="2026-03-12 12:40:51.709030281 +0000 UTC m=+983.907045773" Mar 12 12:40:51.710954 master-0 kubenswrapper[13984]: W0312 12:40:51.709544 13984 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd09166ec_2efd_484a_9177_7edd92703c33.slice/crio-34ba1589c4b485be1bbca87e0be508692b3c16dd06d4c4db3c5364cbf182ff2c WatchSource:0}: Error finding container 34ba1589c4b485be1bbca87e0be508692b3c16dd06d4c4db3c5364cbf182ff2c: Status 404 returned error can't find the container with id 34ba1589c4b485be1bbca87e0be508692b3c16dd06d4c4db3c5364cbf182ff2c Mar 12 12:40:52.080771 master-0 kubenswrapper[13984]: W0312 12:40:52.080732 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9ba1126_67c1_4557_8a58_7931567ff140.slice/crio-85729fbde365b0de6dd3ecc6ea45544cf78f1ac208a505adef4b411e76cb3aca WatchSource:0}: Error finding container 85729fbde365b0de6dd3ecc6ea45544cf78f1ac208a505adef4b411e76cb3aca: Status 404 returned error can't find the container with id 85729fbde365b0de6dd3ecc6ea45544cf78f1ac208a505adef4b411e76cb3aca Mar 12 12:40:52.081865 master-0 kubenswrapper[13984]: I0312 12:40:52.081659 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg" podStartSLOduration=2.8407638840000002 podStartE2EDuration="14.081462956s" podCreationTimestamp="2026-03-12 12:40:38 +0000 UTC" firstStartedPulling="2026-03-12 12:40:40.071302417 +0000 UTC m=+972.269317909" lastFinishedPulling="2026-03-12 12:40:51.312001489 +0000 UTC m=+983.510016981" observedRunningTime="2026-03-12 12:40:51.74172542 +0000 UTC m=+983.939740912" watchObservedRunningTime="2026-03-12 12:40:52.081462956 +0000 UTC m=+984.279478448" Mar 12 12:40:52.083839 master-0 kubenswrapper[13984]: I0312 12:40:52.083802 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh"] Mar 12 12:40:52.108414 master-0 kubenswrapper[13984]: W0312 12:40:52.108357 13984 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab2dd538_a035_4a9c_8026_604a74e84fa0.slice/crio-d201d4ca5f52d8704ad0e1542ab9a63f272e58cf58e199403866dd97145338a2 WatchSource:0}: Error finding container d201d4ca5f52d8704ad0e1542ab9a63f272e58cf58e199403866dd97145338a2: Status 404 returned error can't find the container with id d201d4ca5f52d8704ad0e1542ab9a63f272e58cf58e199403866dd97145338a2 Mar 12 12:40:52.117866 master-0 kubenswrapper[13984]: W0312 12:40:52.114592 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39c8fb0a_1bc4_4d5c_a8c8_72816605835f.slice/crio-030520f9547a7f5c63850e519d640a47fad509b8367decd8f52155f1c3a8e8cc WatchSource:0}: Error finding container 030520f9547a7f5c63850e519d640a47fad509b8367decd8f52155f1c3a8e8cc: Status 404 returned error can't find the container with id 030520f9547a7f5c63850e519d640a47fad509b8367decd8f52155f1c3a8e8cc Mar 12 12:40:52.117866 master-0 kubenswrapper[13984]: I0312 12:40:52.116852 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-r75qh"] Mar 12 12:40:52.129786 master-0 kubenswrapper[13984]: I0312 12:40:52.129737 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-89rvv"] Mar 12 12:40:52.138218 master-0 kubenswrapper[13984]: I0312 12:40:52.138135 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm"] Mar 12 12:40:52.185179 master-0 kubenswrapper[13984]: I0312 12:40:52.185139 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-7p9wl"] Mar 12 12:40:52.186371 master-0 kubenswrapper[13984]: W0312 12:40:52.186346 13984 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05ee168b_a67d_4370_86f3_f13737758499.slice/crio-b25b265e94524d714f583ca068c538a60f80e62c539399edc01ff1e522485919 WatchSource:0}: Error finding container b25b265e94524d714f583ca068c538a60f80e62c539399edc01ff1e522485919: Status 404 returned error can't find the container with id b25b265e94524d714f583ca068c538a60f80e62c539399edc01ff1e522485919 Mar 12 12:40:52.715160 master-0 kubenswrapper[13984]: I0312 12:40:52.715080 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm" event={"ID":"ab2dd538-a035-4a9c-8026-604a74e84fa0","Type":"ContainerStarted","Data":"d201d4ca5f52d8704ad0e1542ab9a63f272e58cf58e199403866dd97145338a2"} Mar 12 12:40:52.716571 master-0 kubenswrapper[13984]: I0312 12:40:52.716524 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-r75qh" event={"ID":"5140e444-2351-4e05-a390-def937eb7c93","Type":"ContainerStarted","Data":"3d9fb8d6eebdc86fb51e9c6bddc7c9ffbc856247cef5832a44930acec3ca8433"} Mar 12 12:40:52.716671 master-0 kubenswrapper[13984]: I0312 12:40:52.716580 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-r75qh" event={"ID":"5140e444-2351-4e05-a390-def937eb7c93","Type":"ContainerStarted","Data":"9d6565a7fe503923149aa304bb353747df14a4afc48db94e94ccfeddc209236f"} Mar 12 12:40:52.718051 master-0 kubenswrapper[13984]: I0312 12:40:52.717993 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-89rvv" event={"ID":"39c8fb0a-1bc4-4d5c-a8c8-72816605835f","Type":"ContainerStarted","Data":"030520f9547a7f5c63850e519d640a47fad509b8367decd8f52155f1c3a8e8cc"} Mar 12 12:40:52.719517 master-0 kubenswrapper[13984]: I0312 12:40:52.719473 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b" event={"ID":"d09166ec-2efd-484a-9177-7edd92703c33","Type":"ContainerStarted","Data":"34ba1589c4b485be1bbca87e0be508692b3c16dd06d4c4db3c5364cbf182ff2c"} Mar 12 12:40:52.721112 master-0 kubenswrapper[13984]: I0312 12:40:52.721076 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh" event={"ID":"e9ba1126-67c1-4557-8a58-7931567ff140","Type":"ContainerStarted","Data":"85729fbde365b0de6dd3ecc6ea45544cf78f1ac208a505adef4b411e76cb3aca"} Mar 12 12:40:52.723446 master-0 kubenswrapper[13984]: I0312 12:40:52.723405 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-7p9wl" event={"ID":"05ee168b-a67d-4370-86f3-f13737758499","Type":"ContainerStarted","Data":"b25b265e94524d714f583ca068c538a60f80e62c539399edc01ff1e522485919"} Mar 12 12:40:52.741188 master-0 kubenswrapper[13984]: I0312 12:40:52.741085 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-r75qh" podStartSLOduration=2.741062139 podStartE2EDuration="2.741062139s" podCreationTimestamp="2026-03-12 12:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:40:52.734666565 +0000 UTC m=+984.932682067" watchObservedRunningTime="2026-03-12 12:40:52.741062139 +0000 UTC m=+984.939077641" Mar 12 12:40:59.361556 master-0 kubenswrapper[13984]: I0312 12:40:59.361503 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-wnrrg" Mar 12 12:41:02.816068 master-0 kubenswrapper[13984]: I0312 12:41:02.815983 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-89rvv" 
event={"ID":"39c8fb0a-1bc4-4d5c-a8c8-72816605835f","Type":"ContainerStarted","Data":"b902ee436a7d4b839b42d55246f965e6aab4cb9bf8a5d180637c0ad99a351496"} Mar 12 12:41:02.816778 master-0 kubenswrapper[13984]: I0312 12:41:02.816434 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-89rvv" Mar 12 12:41:02.818187 master-0 kubenswrapper[13984]: I0312 12:41:02.818128 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b" event={"ID":"d09166ec-2efd-484a-9177-7edd92703c33","Type":"ContainerStarted","Data":"367773393be5c5fd38e9e0297d622cf0425a6fe97cb58bdc4270986ecab9fd57"} Mar 12 12:41:02.820416 master-0 kubenswrapper[13984]: I0312 12:41:02.820367 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh" event={"ID":"e9ba1126-67c1-4557-8a58-7931567ff140","Type":"ContainerStarted","Data":"cc152eb9c2975eced1ee24d063ef2c4174ea028e798e56f3ab27cab31b3be065"} Mar 12 12:41:02.823292 master-0 kubenswrapper[13984]: I0312 12:41:02.823238 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-7p9wl" event={"ID":"05ee168b-a67d-4370-86f3-f13737758499","Type":"ContainerStarted","Data":"f4f3986e7547b8bf9948c60ba9f88174e700a0f21134b601cfd5215802d8ac1e"} Mar 12 12:41:02.823402 master-0 kubenswrapper[13984]: I0312 12:41:02.823351 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-7p9wl" Mar 12 12:41:02.825291 master-0 kubenswrapper[13984]: I0312 12:41:02.825253 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm" event={"ID":"ab2dd538-a035-4a9c-8026-604a74e84fa0","Type":"ContainerStarted","Data":"2f4a32219db58b75abb039d91b6ff43bd3a373a0a99d29c8b689bfc0896d77d5"} 
Mar 12 12:41:02.848934 master-0 kubenswrapper[13984]: I0312 12:41:02.848788 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-89rvv" podStartSLOduration=3.766100128 podStartE2EDuration="13.848760713s" podCreationTimestamp="2026-03-12 12:40:49 +0000 UTC" firstStartedPulling="2026-03-12 12:40:52.120043694 +0000 UTC m=+984.318059186" lastFinishedPulling="2026-03-12 12:41:02.202704279 +0000 UTC m=+994.400719771" observedRunningTime="2026-03-12 12:41:02.837649843 +0000 UTC m=+995.035665335" watchObservedRunningTime="2026-03-12 12:41:02.848760713 +0000 UTC m=+995.046776245" Mar 12 12:41:02.861058 master-0 kubenswrapper[13984]: I0312 12:41:02.860977 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-pmrfm" podStartSLOduration=4.813062696 podStartE2EDuration="14.860958423s" podCreationTimestamp="2026-03-12 12:40:48 +0000 UTC" firstStartedPulling="2026-03-12 12:40:52.1164257 +0000 UTC m=+984.314441192" lastFinishedPulling="2026-03-12 12:41:02.164321427 +0000 UTC m=+994.362336919" observedRunningTime="2026-03-12 12:41:02.85843561 +0000 UTC m=+995.056451102" watchObservedRunningTime="2026-03-12 12:41:02.860958423 +0000 UTC m=+995.058973915" Mar 12 12:41:02.881502 master-0 kubenswrapper[13984]: I0312 12:41:02.881436 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-89rvv" Mar 12 12:41:02.890656 master-0 kubenswrapper[13984]: I0312 12:41:02.890434 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-j54nh" podStartSLOduration=4.771087341 podStartE2EDuration="14.890412549s" podCreationTimestamp="2026-03-12 12:40:48 +0000 UTC" firstStartedPulling="2026-03-12 12:40:52.083409852 +0000 UTC m=+984.281425344" lastFinishedPulling="2026-03-12 
12:41:02.20273506 +0000 UTC m=+994.400750552" observedRunningTime="2026-03-12 12:41:02.876881 +0000 UTC m=+995.074896482" watchObservedRunningTime="2026-03-12 12:41:02.890412549 +0000 UTC m=+995.088428041" Mar 12 12:41:02.924535 master-0 kubenswrapper[13984]: I0312 12:41:02.918783 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-7p9wl" podStartSLOduration=3.9627788859999997 podStartE2EDuration="13.918743822s" podCreationTimestamp="2026-03-12 12:40:49 +0000 UTC" firstStartedPulling="2026-03-12 12:40:52.188989074 +0000 UTC m=+984.387004566" lastFinishedPulling="2026-03-12 12:41:02.14495402 +0000 UTC m=+994.342969502" observedRunningTime="2026-03-12 12:41:02.905310987 +0000 UTC m=+995.103326479" watchObservedRunningTime="2026-03-12 12:41:02.918743822 +0000 UTC m=+995.116759314" Mar 12 12:41:02.951596 master-0 kubenswrapper[13984]: I0312 12:41:02.951509 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-5f7678f964-cf88b" podStartSLOduration=4.498255506 podStartE2EDuration="14.951470932s" podCreationTimestamp="2026-03-12 12:40:48 +0000 UTC" firstStartedPulling="2026-03-12 12:40:51.721251592 +0000 UTC m=+983.919267084" lastFinishedPulling="2026-03-12 12:41:02.174467018 +0000 UTC m=+994.372482510" observedRunningTime="2026-03-12 12:41:02.943665138 +0000 UTC m=+995.141680660" watchObservedRunningTime="2026-03-12 12:41:02.951470932 +0000 UTC m=+995.149486424" Mar 12 12:41:06.200611 master-0 kubenswrapper[13984]: I0312 12:41:06.200549 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-56d7584dd9-8sxrm" Mar 12 12:41:09.819287 master-0 kubenswrapper[13984]: I0312 12:41:09.819209 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-7p9wl" Mar 12 12:41:25.847071 
master-0 kubenswrapper[13984]: I0312 12:41:25.846987 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-787f7977d6-w89p5" Mar 12 12:41:33.163499 master-0 kubenswrapper[13984]: I0312 12:41:33.158340 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v"] Mar 12 12:41:33.163499 master-0 kubenswrapper[13984]: I0312 12:41:33.159965 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.163499 master-0 kubenswrapper[13984]: I0312 12:41:33.162432 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 12 12:41:33.175715 master-0 kubenswrapper[13984]: I0312 12:41:33.175488 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jgcbm"] Mar 12 12:41:33.180858 master-0 kubenswrapper[13984]: I0312 12:41:33.180208 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.188426 master-0 kubenswrapper[13984]: I0312 12:41:33.187958 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v"] Mar 12 12:41:33.194306 master-0 kubenswrapper[13984]: I0312 12:41:33.188775 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 12 12:41:33.194306 master-0 kubenswrapper[13984]: I0312 12:41:33.189015 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 12 12:41:33.243862 master-0 kubenswrapper[13984]: I0312 12:41:33.243796 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcdsf\" (UniqueName: \"kubernetes.io/projected/4ba27210-3da6-4946-a322-9f2d3bed28e0-kube-api-access-hcdsf\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.243973 master-0 kubenswrapper[13984]: I0312 12:41:33.243874 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccsgk\" (UniqueName: \"kubernetes.io/projected/2ccd308f-4d04-477a-9671-ced98cf499c1-kube-api-access-ccsgk\") pod \"frr-k8s-webhook-server-bcc4b6f68-c6n7v\" (UID: \"2ccd308f-4d04-477a-9671-ced98cf499c1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.244179 master-0 kubenswrapper[13984]: I0312 12:41:33.244094 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-sockets\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.244230 master-0 kubenswrapper[13984]: I0312 12:41:33.244203 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ccd308f-4d04-477a-9671-ced98cf499c1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-c6n7v\" (UID: \"2ccd308f-4d04-477a-9671-ced98cf499c1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.244535 master-0 kubenswrapper[13984]: I0312 12:41:33.244451 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.244848 master-0 kubenswrapper[13984]: I0312 12:41:33.244818 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-reloader\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.245043 master-0 kubenswrapper[13984]: I0312 12:41:33.245015 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-conf\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.245086 master-0 kubenswrapper[13984]: I0312 12:41:33.245047 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-startup\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.245575 master-0 kubenswrapper[13984]: I0312 12:41:33.245533 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics-certs\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.327974 master-0 kubenswrapper[13984]: I0312 12:41:33.327924 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-bvl5s"] Mar 12 12:41:33.329434 master-0 kubenswrapper[13984]: I0312 12:41:33.329414 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.345535 master-0 kubenswrapper[13984]: I0312 12:41:33.336500 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jtt27"] Mar 12 12:41:33.345535 master-0 kubenswrapper[13984]: I0312 12:41:33.337940 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 12 12:41:33.345535 master-0 kubenswrapper[13984]: I0312 12:41:33.338144 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.348558 master-0 kubenswrapper[13984]: I0312 12:41:33.348525 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.348707 master-0 kubenswrapper[13984]: I0312 12:41:33.348695 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-reloader\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.348790 master-0 kubenswrapper[13984]: I0312 12:41:33.348779 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-conf\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.348864 master-0 kubenswrapper[13984]: I0312 12:41:33.348852 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-startup\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.348944 master-0 kubenswrapper[13984]: I0312 12:41:33.348933 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics-certs\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.349018 master-0 kubenswrapper[13984]: I0312 12:41:33.349006 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcdsf\" (UniqueName: \"kubernetes.io/projected/4ba27210-3da6-4946-a322-9f2d3bed28e0-kube-api-access-hcdsf\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.349092 master-0 kubenswrapper[13984]: I0312 12:41:33.349079 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccsgk\" (UniqueName: \"kubernetes.io/projected/2ccd308f-4d04-477a-9671-ced98cf499c1-kube-api-access-ccsgk\") pod \"frr-k8s-webhook-server-bcc4b6f68-c6n7v\" (UID: \"2ccd308f-4d04-477a-9671-ced98cf499c1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.349175 master-0 kubenswrapper[13984]: I0312 12:41:33.349164 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-sockets\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.349249 master-0 kubenswrapper[13984]: I0312 12:41:33.349237 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ccd308f-4d04-477a-9671-ced98cf499c1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-c6n7v\" (UID: \"2ccd308f-4d04-477a-9671-ced98cf499c1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.355081 master-0 kubenswrapper[13984]: I0312 12:41:33.351106 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-reloader\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.355081 master-0 kubenswrapper[13984]: I0312 12:41:33.351379 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.355081 master-0 kubenswrapper[13984]: E0312 12:41:33.351896 13984 secret.go:189] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 12 12:41:33.355081 master-0 kubenswrapper[13984]: E0312 12:41:33.351982 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics-certs podName:4ba27210-3da6-4946-a322-9f2d3bed28e0 nodeName:}" failed. No retries permitted until 2026-03-12 12:41:33.851953495 +0000 UTC m=+1026.049969187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics-certs") pod "frr-k8s-jgcbm" (UID: "4ba27210-3da6-4946-a322-9f2d3bed28e0") : secret "frr-k8s-certs-secret" not found Mar 12 12:41:33.355081 master-0 kubenswrapper[13984]: I0312 12:41:33.352369 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-conf\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.355081 master-0 kubenswrapper[13984]: I0312 12:41:33.352772 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-sockets\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.355081 master-0 kubenswrapper[13984]: I0312 12:41:33.353828 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: 
\"kubernetes.io/configmap/4ba27210-3da6-4946-a322-9f2d3bed28e0-frr-startup\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.356326 master-0 kubenswrapper[13984]: I0312 12:41:33.356293 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2ccd308f-4d04-477a-9671-ced98cf499c1-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-c6n7v\" (UID: \"2ccd308f-4d04-477a-9671-ced98cf499c1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.365493 master-0 kubenswrapper[13984]: I0312 12:41:33.360016 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-bvl5s"] Mar 12 12:41:33.368993 master-0 kubenswrapper[13984]: I0312 12:41:33.368959 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 12 12:41:33.369192 master-0 kubenswrapper[13984]: I0312 12:41:33.369011 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 12 12:41:33.369316 master-0 kubenswrapper[13984]: I0312 12:41:33.369281 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 12 12:41:33.404507 master-0 kubenswrapper[13984]: I0312 12:41:33.387134 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccsgk\" (UniqueName: \"kubernetes.io/projected/2ccd308f-4d04-477a-9671-ced98cf499c1-kube-api-access-ccsgk\") pod \"frr-k8s-webhook-server-bcc4b6f68-c6n7v\" (UID: \"2ccd308f-4d04-477a-9671-ced98cf499c1\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.404507 master-0 kubenswrapper[13984]: I0312 12:41:33.387391 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcdsf\" (UniqueName: 
\"kubernetes.io/projected/4ba27210-3da6-4946-a322-9f2d3bed28e0-kube-api-access-hcdsf\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.457497 master-0 kubenswrapper[13984]: I0312 12:41:33.454373 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-metrics-certs\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.457497 master-0 kubenswrapper[13984]: I0312 12:41:33.454431 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t9fm\" (UniqueName: \"kubernetes.io/projected/57f59392-f9b6-4b99-ad37-33c5b7c04adb-kube-api-access-9t9fm\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.457497 master-0 kubenswrapper[13984]: I0312 12:41:33.454492 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57f59392-f9b6-4b99-ad37-33c5b7c04adb-metrics-certs\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.457497 master-0 kubenswrapper[13984]: I0312 12:41:33.454543 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6b5193f-85dd-431d-8de7-27731f2cefd4-metallb-excludel2\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.457497 master-0 kubenswrapper[13984]: I0312 12:41:33.454561 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnpm\" (UniqueName: \"kubernetes.io/projected/c6b5193f-85dd-431d-8de7-27731f2cefd4-kube-api-access-ztnpm\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.457497 master-0 kubenswrapper[13984]: I0312 12:41:33.454620 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.457497 master-0 kubenswrapper[13984]: I0312 12:41:33.454641 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f59392-f9b6-4b99-ad37-33c5b7c04adb-cert\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.510233 master-0 kubenswrapper[13984]: I0312 12:41:33.510185 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.555564 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-metrics-certs\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.555627 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t9fm\" (UniqueName: \"kubernetes.io/projected/57f59392-f9b6-4b99-ad37-33c5b7c04adb-kube-api-access-9t9fm\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.555654 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57f59392-f9b6-4b99-ad37-33c5b7c04adb-metrics-certs\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.555696 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6b5193f-85dd-431d-8de7-27731f2cefd4-metallb-excludel2\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.555714 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnpm\" (UniqueName: \"kubernetes.io/projected/c6b5193f-85dd-431d-8de7-27731f2cefd4-kube-api-access-ztnpm\") pod \"speaker-jtt27\" (UID: 
\"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.555778 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.555800 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f59392-f9b6-4b99-ad37-33c5b7c04adb-cert\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: E0312 12:41:33.557792 13984 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: E0312 12:41:33.557901 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist podName:c6b5193f-85dd-431d-8de7-27731f2cefd4 nodeName:}" failed. No retries permitted until 2026-03-12 12:41:34.057868148 +0000 UTC m=+1026.255883640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist") pod "speaker-jtt27" (UID: "c6b5193f-85dd-431d-8de7-27731f2cefd4") : secret "metallb-memberlist" not found Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.558415 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/c6b5193f-85dd-431d-8de7-27731f2cefd4-metallb-excludel2\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.559235 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-metrics-certs\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.562510 master-0 kubenswrapper[13984]: I0312 12:41:33.560501 13984 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 12 12:41:33.573503 master-0 kubenswrapper[13984]: I0312 12:41:33.564112 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/57f59392-f9b6-4b99-ad37-33c5b7c04adb-metrics-certs\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.573503 master-0 kubenswrapper[13984]: I0312 12:41:33.572780 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/57f59392-f9b6-4b99-ad37-33c5b7c04adb-cert\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.585093 master-0 kubenswrapper[13984]: I0312 
12:41:33.585045 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t9fm\" (UniqueName: \"kubernetes.io/projected/57f59392-f9b6-4b99-ad37-33c5b7c04adb-kube-api-access-9t9fm\") pod \"controller-7bb4cc7c98-bvl5s\" (UID: \"57f59392-f9b6-4b99-ad37-33c5b7c04adb\") " pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.597403 master-0 kubenswrapper[13984]: I0312 12:41:33.596106 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztnpm\" (UniqueName: \"kubernetes.io/projected/c6b5193f-85dd-431d-8de7-27731f2cefd4-kube-api-access-ztnpm\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:33.740619 master-0 kubenswrapper[13984]: I0312 12:41:33.740456 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:33.861599 master-0 kubenswrapper[13984]: I0312 12:41:33.860421 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics-certs\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:33.894955 master-0 kubenswrapper[13984]: I0312 12:41:33.886624 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ba27210-3da6-4946-a322-9f2d3bed28e0-metrics-certs\") pod \"frr-k8s-jgcbm\" (UID: \"4ba27210-3da6-4946-a322-9f2d3bed28e0\") " pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:34.065777 master-0 kubenswrapper[13984]: I0312 12:41:34.065587 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist\") pod \"speaker-jtt27\" (UID: 
\"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:34.065777 master-0 kubenswrapper[13984]: E0312 12:41:34.065733 13984 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 12 12:41:34.066881 master-0 kubenswrapper[13984]: E0312 12:41:34.065789 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist podName:c6b5193f-85dd-431d-8de7-27731f2cefd4 nodeName:}" failed. No retries permitted until 2026-03-12 12:41:35.065773154 +0000 UTC m=+1027.263788646 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist") pod "speaker-jtt27" (UID: "c6b5193f-85dd-431d-8de7-27731f2cefd4") : secret "metallb-memberlist" not found Mar 12 12:41:34.134539 master-0 kubenswrapper[13984]: I0312 12:41:34.130916 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v"] Mar 12 12:41:34.157530 master-0 kubenswrapper[13984]: I0312 12:41:34.153553 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:34.265261 master-0 kubenswrapper[13984]: I0312 12:41:34.265189 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-bvl5s"] Mar 12 12:41:35.088030 master-0 kubenswrapper[13984]: I0312 12:41:35.087970 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:35.091480 master-0 kubenswrapper[13984]: I0312 12:41:35.091432 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/c6b5193f-85dd-431d-8de7-27731f2cefd4-memberlist\") pod \"speaker-jtt27\" (UID: \"c6b5193f-85dd-431d-8de7-27731f2cefd4\") " pod="metallb-system/speaker-jtt27" Mar 12 12:41:35.104001 master-0 kubenswrapper[13984]: I0312 12:41:35.103937 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerStarted","Data":"afeaf4cf5e0d6ca8edbe194175e67e13739936c21dac0d7d762a1d4699dad8d5"} Mar 12 12:41:35.105464 master-0 kubenswrapper[13984]: I0312 12:41:35.105441 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-bvl5s" event={"ID":"57f59392-f9b6-4b99-ad37-33c5b7c04adb","Type":"ContainerStarted","Data":"7d50a0a8c88bdebad8181a18026b9258a4e8dfcd4355adcf320d9d46c71a8eb2"} Mar 12 12:41:35.105558 master-0 kubenswrapper[13984]: I0312 12:41:35.105466 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-bvl5s" event={"ID":"57f59392-f9b6-4b99-ad37-33c5b7c04adb","Type":"ContainerStarted","Data":"56807ca0b18b2a3d6dc1aaeecefe53855bcdae832eceab4f5fa5d9357c301b6d"} Mar 12 12:41:35.106405 master-0 
kubenswrapper[13984]: I0312 12:41:35.106383 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" event={"ID":"2ccd308f-4d04-477a-9671-ced98cf499c1","Type":"ContainerStarted","Data":"85af688ddb6b89d52a3abc11d5d6277a08fe83cb37787c0548c85d249edfe3b1"} Mar 12 12:41:35.211365 master-0 kubenswrapper[13984]: I0312 12:41:35.211297 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg"] Mar 12 12:41:35.212941 master-0 kubenswrapper[13984]: I0312 12:41:35.212908 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" Mar 12 12:41:35.226367 master-0 kubenswrapper[13984]: I0312 12:41:35.226321 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg"] Mar 12 12:41:35.296323 master-0 kubenswrapper[13984]: I0312 12:41:35.295675 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs6vp\" (UniqueName: \"kubernetes.io/projected/8e356084-3bba-4665-b579-fb8e0a999783-kube-api-access-zs6vp\") pod \"nmstate-metrics-9b8c8685d-hmdgg\" (UID: \"8e356084-3bba-4665-b579-fb8e0a999783\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" Mar 12 12:41:35.297088 master-0 kubenswrapper[13984]: I0312 12:41:35.297046 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jtt27" Mar 12 12:41:35.317704 master-0 kubenswrapper[13984]: I0312 12:41:35.317617 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-zhp64"] Mar 12 12:41:35.321817 master-0 kubenswrapper[13984]: I0312 12:41:35.319341 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:35.322629 master-0 kubenswrapper[13984]: I0312 12:41:35.322572 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-tpsct"] Mar 12 12:41:35.322795 master-0 kubenswrapper[13984]: I0312 12:41:35.322749 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 12 12:41:35.324662 master-0 kubenswrapper[13984]: I0312 12:41:35.323898 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.359429 master-0 kubenswrapper[13984]: I0312 12:41:35.359384 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-zhp64"] Mar 12 12:41:35.397297 master-0 kubenswrapper[13984]: I0312 12:41:35.397107 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kl58\" (UniqueName: \"kubernetes.io/projected/a379bead-8afc-458e-90cf-5859fdd27e8c-kube-api-access-6kl58\") pod \"nmstate-webhook-5f558f5558-zhp64\" (UID: \"a379bead-8afc-458e-90cf-5859fdd27e8c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:35.397297 master-0 kubenswrapper[13984]: I0312 12:41:35.397161 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-nmstate-lock\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.397297 master-0 kubenswrapper[13984]: I0312 12:41:35.397184 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-ovs-socket\") pod 
\"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.397297 master-0 kubenswrapper[13984]: I0312 12:41:35.397214 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs6vp\" (UniqueName: \"kubernetes.io/projected/8e356084-3bba-4665-b579-fb8e0a999783-kube-api-access-zs6vp\") pod \"nmstate-metrics-9b8c8685d-hmdgg\" (UID: \"8e356084-3bba-4665-b579-fb8e0a999783\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" Mar 12 12:41:35.397297 master-0 kubenswrapper[13984]: I0312 12:41:35.397243 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr7rm\" (UniqueName: \"kubernetes.io/projected/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-kube-api-access-mr7rm\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.397297 master-0 kubenswrapper[13984]: I0312 12:41:35.397300 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a379bead-8afc-458e-90cf-5859fdd27e8c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zhp64\" (UID: \"a379bead-8afc-458e-90cf-5859fdd27e8c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:35.397736 master-0 kubenswrapper[13984]: I0312 12:41:35.397355 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-dbus-socket\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.417266 master-0 kubenswrapper[13984]: I0312 12:41:35.416775 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs6vp\" 
(UniqueName: \"kubernetes.io/projected/8e356084-3bba-4665-b579-fb8e0a999783-kube-api-access-zs6vp\") pod \"nmstate-metrics-9b8c8685d-hmdgg\" (UID: \"8e356084-3bba-4665-b579-fb8e0a999783\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" Mar 12 12:41:35.499221 master-0 kubenswrapper[13984]: I0312 12:41:35.499122 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kl58\" (UniqueName: \"kubernetes.io/projected/a379bead-8afc-458e-90cf-5859fdd27e8c-kube-api-access-6kl58\") pod \"nmstate-webhook-5f558f5558-zhp64\" (UID: \"a379bead-8afc-458e-90cf-5859fdd27e8c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:35.499221 master-0 kubenswrapper[13984]: I0312 12:41:35.499186 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-nmstate-lock\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.499221 master-0 kubenswrapper[13984]: I0312 12:41:35.499207 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-ovs-socket\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.499221 master-0 kubenswrapper[13984]: I0312 12:41:35.499232 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr7rm\" (UniqueName: \"kubernetes.io/projected/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-kube-api-access-mr7rm\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.499587 master-0 kubenswrapper[13984]: I0312 12:41:35.499276 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a379bead-8afc-458e-90cf-5859fdd27e8c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zhp64\" (UID: \"a379bead-8afc-458e-90cf-5859fdd27e8c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:35.499587 master-0 kubenswrapper[13984]: I0312 12:41:35.499316 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-dbus-socket\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.499587 master-0 kubenswrapper[13984]: I0312 12:41:35.499407 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-dbus-socket\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.499587 master-0 kubenswrapper[13984]: I0312 12:41:35.499463 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-nmstate-lock\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.499587 master-0 kubenswrapper[13984]: I0312 12:41:35.499470 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-ovs-socket\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.501094 master-0 kubenswrapper[13984]: E0312 12:41:35.500082 13984 secret.go:189] Couldn't get secret 
openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 12 12:41:35.501094 master-0 kubenswrapper[13984]: E0312 12:41:35.500170 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a379bead-8afc-458e-90cf-5859fdd27e8c-tls-key-pair podName:a379bead-8afc-458e-90cf-5859fdd27e8c nodeName:}" failed. No retries permitted until 2026-03-12 12:41:36.000147497 +0000 UTC m=+1028.198162989 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/a379bead-8afc-458e-90cf-5859fdd27e8c-tls-key-pair") pod "nmstate-webhook-5f558f5558-zhp64" (UID: "a379bead-8afc-458e-90cf-5859fdd27e8c") : secret "openshift-nmstate-webhook" not found Mar 12 12:41:35.524019 master-0 kubenswrapper[13984]: I0312 12:41:35.523954 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr7rm\" (UniqueName: \"kubernetes.io/projected/16b99c26-f5eb-4c89-ac3c-e08d2a3a9638-kube-api-access-mr7rm\") pod \"nmstate-handler-tpsct\" (UID: \"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638\") " pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.532893 master-0 kubenswrapper[13984]: I0312 12:41:35.532339 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" Mar 12 12:41:35.560578 master-0 kubenswrapper[13984]: I0312 12:41:35.560374 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kl58\" (UniqueName: \"kubernetes.io/projected/a379bead-8afc-458e-90cf-5859fdd27e8c-kube-api-access-6kl58\") pod \"nmstate-webhook-5f558f5558-zhp64\" (UID: \"a379bead-8afc-458e-90cf-5859fdd27e8c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:35.594304 master-0 kubenswrapper[13984]: I0312 12:41:35.589864 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b"] Mar 12 12:41:35.594304 master-0 kubenswrapper[13984]: I0312 12:41:35.591146 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.599661 master-0 kubenswrapper[13984]: I0312 12:41:35.595180 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 12 12:41:35.599661 master-0 kubenswrapper[13984]: I0312 12:41:35.595226 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 12 12:41:35.612941 master-0 kubenswrapper[13984]: I0312 12:41:35.612872 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b"] Mar 12 12:41:35.705550 master-0 kubenswrapper[13984]: I0312 12:41:35.705422 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrrxw\" (UniqueName: \"kubernetes.io/projected/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-kube-api-access-wrrxw\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.705790 master-0 kubenswrapper[13984]: 
I0312 12:41:35.705772 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.705889 master-0 kubenswrapper[13984]: I0312 12:41:35.705876 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.766380 master-0 kubenswrapper[13984]: I0312 12:41:35.750117 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-79c4496f46-rfmzb"] Mar 12 12:41:35.766380 master-0 kubenswrapper[13984]: I0312 12:41:35.752292 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.786722 master-0 kubenswrapper[13984]: I0312 12:41:35.786565 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79c4496f46-rfmzb"] Mar 12 12:41:35.807643 master-0 kubenswrapper[13984]: I0312 12:41:35.807575 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.807643 master-0 kubenswrapper[13984]: I0312 12:41:35.807634 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-service-ca\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.807943 master-0 kubenswrapper[13984]: I0312 12:41:35.807901 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-oauth-serving-cert\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.808041 master-0 kubenswrapper[13984]: I0312 12:41:35.808016 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.808225 master-0 
kubenswrapper[13984]: I0312 12:41:35.808200 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-serving-cert\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.808275 master-0 kubenswrapper[13984]: I0312 12:41:35.808233 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-config\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.808317 master-0 kubenswrapper[13984]: I0312 12:41:35.808274 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zjpl\" (UniqueName: \"kubernetes.io/projected/e0ace1d1-a0ed-404b-8a89-117a6b179047-kube-api-access-5zjpl\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.808367 master-0 kubenswrapper[13984]: I0312 12:41:35.808330 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrrxw\" (UniqueName: \"kubernetes.io/projected/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-kube-api-access-wrrxw\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.808367 master-0 kubenswrapper[13984]: I0312 12:41:35.808365 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-oauth-config\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.808478 master-0 kubenswrapper[13984]: I0312 12:41:35.808453 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-trusted-ca-bundle\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.809461 master-0 kubenswrapper[13984]: I0312 12:41:35.808965 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.821174 master-0 kubenswrapper[13984]: I0312 12:41:35.821136 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:35.844638 master-0 kubenswrapper[13984]: I0312 12:41:35.828898 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.844638 master-0 kubenswrapper[13984]: I0312 12:41:35.829212 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrrxw\" (UniqueName: \"kubernetes.io/projected/2a98b253-d5ac-4e39-ac14-d24c5dad3ec2-kube-api-access-wrrxw\") pod \"nmstate-console-plugin-86f58fcf4-9n27b\" (UID: \"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:35.914736 master-0 kubenswrapper[13984]: I0312 12:41:35.914368 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-oauth-serving-cert\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.914736 master-0 kubenswrapper[13984]: I0312 12:41:35.914532 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-serving-cert\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.914736 master-0 kubenswrapper[13984]: I0312 12:41:35.914561 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-config\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.914736 master-0 kubenswrapper[13984]: I0312 12:41:35.914595 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zjpl\" (UniqueName: \"kubernetes.io/projected/e0ace1d1-a0ed-404b-8a89-117a6b179047-kube-api-access-5zjpl\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.914736 master-0 kubenswrapper[13984]: I0312 12:41:35.914651 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-oauth-config\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.914736 master-0 kubenswrapper[13984]: I0312 12:41:35.914717 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-trusted-ca-bundle\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.914736 master-0 kubenswrapper[13984]: I0312 12:41:35.914752 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-service-ca\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.917294 master-0 kubenswrapper[13984]: I0312 12:41:35.917255 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-config\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.917814 master-0 kubenswrapper[13984]: I0312 12:41:35.917779 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-oauth-serving-cert\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.920357 master-0 kubenswrapper[13984]: I0312 12:41:35.919008 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-trusted-ca-bundle\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.922370 master-0 kubenswrapper[13984]: I0312 12:41:35.922314 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-oauth-config\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.935035 master-0 kubenswrapper[13984]: I0312 12:41:35.915615 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e0ace1d1-a0ed-404b-8a89-117a6b179047-service-ca\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.935245 master-0 kubenswrapper[13984]: I0312 12:41:35.935171 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e0ace1d1-a0ed-404b-8a89-117a6b179047-console-serving-cert\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.938068 master-0 kubenswrapper[13984]: I0312 12:41:35.937998 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zjpl\" (UniqueName: \"kubernetes.io/projected/e0ace1d1-a0ed-404b-8a89-117a6b179047-kube-api-access-5zjpl\") pod \"console-79c4496f46-rfmzb\" (UID: \"e0ace1d1-a0ed-404b-8a89-117a6b179047\") " pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:35.965091 master-0 kubenswrapper[13984]: I0312 12:41:35.964999 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" Mar 12 12:41:36.015981 master-0 kubenswrapper[13984]: I0312 12:41:36.015924 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a379bead-8afc-458e-90cf-5859fdd27e8c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zhp64\" (UID: \"a379bead-8afc-458e-90cf-5859fdd27e8c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:36.022370 master-0 kubenswrapper[13984]: I0312 12:41:36.022335 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/a379bead-8afc-458e-90cf-5859fdd27e8c-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-zhp64\" (UID: \"a379bead-8afc-458e-90cf-5859fdd27e8c\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:36.058431 master-0 kubenswrapper[13984]: I0312 12:41:36.058366 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg"] Mar 12 12:41:36.088171 master-0 kubenswrapper[13984]: I0312 12:41:36.087699 13984 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:36.133161 master-0 kubenswrapper[13984]: I0312 12:41:36.133054 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" event={"ID":"8e356084-3bba-4665-b579-fb8e0a999783","Type":"ContainerStarted","Data":"11b1bc3ab19cbebf4239f1096a6be766f07978e55efa6a0ffafa07768243f06b"} Mar 12 12:41:36.136620 master-0 kubenswrapper[13984]: I0312 12:41:36.136561 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jtt27" event={"ID":"c6b5193f-85dd-431d-8de7-27731f2cefd4","Type":"ContainerStarted","Data":"ba9b1dad10e35d649eddc93bd24586c07ab044c1c7b24acaa1347bb7be16f98b"} Mar 12 12:41:36.136712 master-0 kubenswrapper[13984]: I0312 12:41:36.136628 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jtt27" event={"ID":"c6b5193f-85dd-431d-8de7-27731f2cefd4","Type":"ContainerStarted","Data":"ca88f9d6f9a5edccb5fcc15bc803a5fde9549db3cf03da9a9a8cec0c10c4bdf2"} Mar 12 12:41:36.137879 master-0 kubenswrapper[13984]: I0312 12:41:36.137842 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tpsct" event={"ID":"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638","Type":"ContainerStarted","Data":"1d5a8434e49c19f26ceb64441615ba2c2f95343a9ee65ff991aa9d5f1a108830"} Mar 12 12:41:36.333345 master-0 kubenswrapper[13984]: I0312 12:41:36.333267 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:36.435657 master-0 kubenswrapper[13984]: I0312 12:41:36.435587 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b"] Mar 12 12:41:36.458396 master-0 kubenswrapper[13984]: W0312 12:41:36.455661 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a98b253_d5ac_4e39_ac14_d24c5dad3ec2.slice/crio-8d7b8eb5cd4b74d603387b8d983a0a7cec506d5a5737c6446ceee2eb7f267fdf WatchSource:0}: Error finding container 8d7b8eb5cd4b74d603387b8d983a0a7cec506d5a5737c6446ceee2eb7f267fdf: Status 404 returned error can't find the container with id 8d7b8eb5cd4b74d603387b8d983a0a7cec506d5a5737c6446ceee2eb7f267fdf Mar 12 12:41:36.880374 master-0 kubenswrapper[13984]: W0312 12:41:36.879746 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0ace1d1_a0ed_404b_8a89_117a6b179047.slice/crio-483e79f09700ffd9bbe1f1978e0db647df3606484daa475329301364d5bb513a WatchSource:0}: Error finding container 483e79f09700ffd9bbe1f1978e0db647df3606484daa475329301364d5bb513a: Status 404 returned error can't find the container with id 483e79f09700ffd9bbe1f1978e0db647df3606484daa475329301364d5bb513a Mar 12 12:41:36.886031 master-0 kubenswrapper[13984]: I0312 12:41:36.885966 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-79c4496f46-rfmzb"] Mar 12 12:41:36.922210 master-0 kubenswrapper[13984]: I0312 12:41:36.922009 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-zhp64"] Mar 12 12:41:37.166380 master-0 kubenswrapper[13984]: I0312 12:41:37.166197 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" 
event={"ID":"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2","Type":"ContainerStarted","Data":"8d7b8eb5cd4b74d603387b8d983a0a7cec506d5a5737c6446ceee2eb7f267fdf"} Mar 12 12:41:37.169833 master-0 kubenswrapper[13984]: I0312 12:41:37.169782 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79c4496f46-rfmzb" event={"ID":"e0ace1d1-a0ed-404b-8a89-117a6b179047","Type":"ContainerStarted","Data":"9a77a6935cece4e8e7c3cf48bd43645432fc8c184fbdc54f506f51db19d4aa6b"} Mar 12 12:41:37.169912 master-0 kubenswrapper[13984]: I0312 12:41:37.169844 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-79c4496f46-rfmzb" event={"ID":"e0ace1d1-a0ed-404b-8a89-117a6b179047","Type":"ContainerStarted","Data":"483e79f09700ffd9bbe1f1978e0db647df3606484daa475329301364d5bb513a"} Mar 12 12:41:37.176695 master-0 kubenswrapper[13984]: I0312 12:41:37.175952 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" event={"ID":"a379bead-8afc-458e-90cf-5859fdd27e8c","Type":"ContainerStarted","Data":"e8af3ff4c5c745c9efde5573f9e80b14ca167559b1359296ab0b910f51e8e707"} Mar 12 12:41:37.198295 master-0 kubenswrapper[13984]: I0312 12:41:37.198163 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-79c4496f46-rfmzb" podStartSLOduration=2.198132259 podStartE2EDuration="2.198132259s" podCreationTimestamp="2026-03-12 12:41:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:41:37.193604369 +0000 UTC m=+1029.391619881" watchObservedRunningTime="2026-03-12 12:41:37.198132259 +0000 UTC m=+1029.396147751" Mar 12 12:41:39.210271 master-0 kubenswrapper[13984]: I0312 12:41:39.210210 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jtt27" 
event={"ID":"c6b5193f-85dd-431d-8de7-27731f2cefd4","Type":"ContainerStarted","Data":"c9c160f122f6541a25742e9e8df4fb1ae107dfb2c35287c64f351e667db14fc4"} Mar 12 12:41:39.211455 master-0 kubenswrapper[13984]: I0312 12:41:39.211435 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jtt27" Mar 12 12:41:39.214696 master-0 kubenswrapper[13984]: I0312 12:41:39.214673 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-bvl5s" event={"ID":"57f59392-f9b6-4b99-ad37-33c5b7c04adb","Type":"ContainerStarted","Data":"26a26baf2a5adcc26ffc841a1ee9de11ed6af58b43383a3820848cf91e2f81b7"} Mar 12 12:41:39.215237 master-0 kubenswrapper[13984]: I0312 12:41:39.215209 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:39.333501 master-0 kubenswrapper[13984]: I0312 12:41:39.333402 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jtt27" podStartSLOduration=4.145161859 podStartE2EDuration="6.33338498s" podCreationTimestamp="2026-03-12 12:41:33 +0000 UTC" firstStartedPulling="2026-03-12 12:41:35.842603602 +0000 UTC m=+1028.040619094" lastFinishedPulling="2026-03-12 12:41:38.030826723 +0000 UTC m=+1030.228842215" observedRunningTime="2026-03-12 12:41:39.329588621 +0000 UTC m=+1031.527604123" watchObservedRunningTime="2026-03-12 12:41:39.33338498 +0000 UTC m=+1031.531400472" Mar 12 12:41:39.368158 master-0 kubenswrapper[13984]: I0312 12:41:39.368067 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-bvl5s" podStartSLOduration=2.839869374 podStartE2EDuration="6.368046656s" podCreationTimestamp="2026-03-12 12:41:33 +0000 UTC" firstStartedPulling="2026-03-12 12:41:34.497645297 +0000 UTC m=+1026.695660789" lastFinishedPulling="2026-03-12 12:41:38.025822579 +0000 UTC m=+1030.223838071" observedRunningTime="2026-03-12 
12:41:39.361652822 +0000 UTC m=+1031.559668324" watchObservedRunningTime="2026-03-12 12:41:39.368046656 +0000 UTC m=+1031.566062148" Mar 12 12:41:43.259855 master-0 kubenswrapper[13984]: I0312 12:41:43.259801 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-tpsct" event={"ID":"16b99c26-f5eb-4c89-ac3c-e08d2a3a9638","Type":"ContainerStarted","Data":"c473a33155c62ceef20e8489fe5795abef68e0e0e809a42c28a73e25a2068986"} Mar 12 12:41:43.260688 master-0 kubenswrapper[13984]: I0312 12:41:43.259941 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:43.262443 master-0 kubenswrapper[13984]: I0312 12:41:43.262395 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" event={"ID":"a379bead-8afc-458e-90cf-5859fdd27e8c","Type":"ContainerStarted","Data":"84a25bfb6746e7fbef268d34f07c9da22d91a223173709f362b77de318c69130"} Mar 12 12:41:43.262653 master-0 kubenswrapper[13984]: I0312 12:41:43.262613 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:41:43.264865 master-0 kubenswrapper[13984]: I0312 12:41:43.264830 13984 generic.go:334] "Generic (PLEG): container finished" podID="4ba27210-3da6-4946-a322-9f2d3bed28e0" containerID="eb30cfea14827c37c1ef9d99cca3e6a6134d2ed89820e4b2a66c4e3aa199c0d7" exitCode=0 Mar 12 12:41:43.264948 master-0 kubenswrapper[13984]: I0312 12:41:43.264875 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerDied","Data":"eb30cfea14827c37c1ef9d99cca3e6a6134d2ed89820e4b2a66c4e3aa199c0d7"} Mar 12 12:41:43.268995 master-0 kubenswrapper[13984]: I0312 12:41:43.268959 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" 
event={"ID":"2ccd308f-4d04-477a-9671-ced98cf499c1","Type":"ContainerStarted","Data":"5ca3f2b12636cd78b4735814679ed196a961a9cd4db3183fa133d835d4ae588f"} Mar 12 12:41:43.270080 master-0 kubenswrapper[13984]: I0312 12:41:43.269771 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:43.280089 master-0 kubenswrapper[13984]: I0312 12:41:43.280029 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-tpsct" podStartSLOduration=2.108693942 podStartE2EDuration="8.28001337s" podCreationTimestamp="2026-03-12 12:41:35 +0000 UTC" firstStartedPulling="2026-03-12 12:41:35.874470977 +0000 UTC m=+1028.072486469" lastFinishedPulling="2026-03-12 12:41:42.045790405 +0000 UTC m=+1034.243805897" observedRunningTime="2026-03-12 12:41:43.277976071 +0000 UTC m=+1035.475991563" watchObservedRunningTime="2026-03-12 12:41:43.28001337 +0000 UTC m=+1035.478028862" Mar 12 12:41:43.324512 master-0 kubenswrapper[13984]: I0312 12:41:43.318147 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" podStartSLOduration=3.164471562 podStartE2EDuration="8.318128394s" podCreationTimestamp="2026-03-12 12:41:35 +0000 UTC" firstStartedPulling="2026-03-12 12:41:36.9083927 +0000 UTC m=+1029.106408182" lastFinishedPulling="2026-03-12 12:41:42.062049522 +0000 UTC m=+1034.260065014" observedRunningTime="2026-03-12 12:41:43.294795254 +0000 UTC m=+1035.492810736" watchObservedRunningTime="2026-03-12 12:41:43.318128394 +0000 UTC m=+1035.516143886" Mar 12 12:41:43.336143 master-0 kubenswrapper[13984]: I0312 12:41:43.334761 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" podStartSLOduration=2.40449062 podStartE2EDuration="10.334739821s" podCreationTimestamp="2026-03-12 12:41:33 +0000 UTC" 
firstStartedPulling="2026-03-12 12:41:34.133105538 +0000 UTC m=+1026.331121030" lastFinishedPulling="2026-03-12 12:41:42.063354739 +0000 UTC m=+1034.261370231" observedRunningTime="2026-03-12 12:41:43.333973539 +0000 UTC m=+1035.531989051" watchObservedRunningTime="2026-03-12 12:41:43.334739821 +0000 UTC m=+1035.532755323" Mar 12 12:41:44.283353 master-0 kubenswrapper[13984]: I0312 12:41:44.283307 13984 generic.go:334] "Generic (PLEG): container finished" podID="4ba27210-3da6-4946-a322-9f2d3bed28e0" containerID="704ad24d146cac10c9a8f414244d43c5a586c4a4b71b9b7980278413364ee24d" exitCode=0 Mar 12 12:41:44.284009 master-0 kubenswrapper[13984]: I0312 12:41:44.283385 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerDied","Data":"704ad24d146cac10c9a8f414244d43c5a586c4a4b71b9b7980278413364ee24d"} Mar 12 12:41:45.293859 master-0 kubenswrapper[13984]: I0312 12:41:45.293808 13984 generic.go:334] "Generic (PLEG): container finished" podID="4ba27210-3da6-4946-a322-9f2d3bed28e0" containerID="9f57850a112d5696e10a6660493540ce7f5d840e5fb67c83ba7fdd67a44aee49" exitCode=0 Mar 12 12:41:45.294334 master-0 kubenswrapper[13984]: I0312 12:41:45.293885 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerDied","Data":"9f57850a112d5696e10a6660493540ce7f5d840e5fb67c83ba7fdd67a44aee49"} Mar 12 12:41:45.301577 master-0 kubenswrapper[13984]: I0312 12:41:45.300847 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jtt27" Mar 12 12:41:46.089840 master-0 kubenswrapper[13984]: I0312 12:41:46.089778 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:46.089840 master-0 kubenswrapper[13984]: I0312 12:41:46.089822 13984 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:46.095570 master-0 kubenswrapper[13984]: I0312 12:41:46.095520 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:46.305643 master-0 kubenswrapper[13984]: I0312 12:41:46.305579 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerStarted","Data":"c3f6747f9165e41641e9338ae66332269c3229bf0e0dfd0c42531193c60001b1"} Mar 12 12:41:46.305643 master-0 kubenswrapper[13984]: I0312 12:41:46.305638 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerStarted","Data":"e241fc3b53fb95a10737bea472a85cf777824c6874878f29cc73db44e8d0ee67"} Mar 12 12:41:46.306323 master-0 kubenswrapper[13984]: I0312 12:41:46.305654 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerStarted","Data":"75005b9a71eebc2a13fc589c3f4a0cd875194cd585d976e46c7c4f34037fd0d0"} Mar 12 12:41:46.306323 master-0 kubenswrapper[13984]: I0312 12:41:46.305665 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerStarted","Data":"c21f3cce783d6661c4718e1088eb9d34b6c4556fea55e2d3b6a3b60b663881e4"} Mar 12 12:41:46.306323 master-0 kubenswrapper[13984]: I0312 12:41:46.305677 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerStarted","Data":"550f17ddc0332502d336907d48939e8dd35b9c37bbaeb62b4fdca368f0f3fc94"} Mar 12 12:41:46.310249 master-0 kubenswrapper[13984]: I0312 12:41:46.309982 13984 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-79c4496f46-rfmzb" Mar 12 12:41:46.380932 master-0 kubenswrapper[13984]: I0312 12:41:46.380544 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f76bf8d9-qghrl"] Mar 12 12:41:47.321282 master-0 kubenswrapper[13984]: I0312 12:41:47.321227 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jgcbm" event={"ID":"4ba27210-3da6-4946-a322-9f2d3bed28e0","Type":"ContainerStarted","Data":"641c7ac9abae28288b340ee59785288755a32f645b28eb155fde7934a157b20e"} Mar 12 12:41:47.321946 master-0 kubenswrapper[13984]: I0312 12:41:47.321905 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:47.357412 master-0 kubenswrapper[13984]: I0312 12:41:47.357342 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jgcbm" podStartSLOduration=6.6102724219999995 podStartE2EDuration="14.357326182s" podCreationTimestamp="2026-03-12 12:41:33 +0000 UTC" firstStartedPulling="2026-03-12 12:41:34.296880691 +0000 UTC m=+1026.494896173" lastFinishedPulling="2026-03-12 12:41:42.043934451 +0000 UTC m=+1034.241949933" observedRunningTime="2026-03-12 12:41:47.353217004 +0000 UTC m=+1039.551232506" watchObservedRunningTime="2026-03-12 12:41:47.357326182 +0000 UTC m=+1039.555341674" Mar 12 12:41:48.334719 master-0 kubenswrapper[13984]: I0312 12:41:48.334563 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" event={"ID":"8e356084-3bba-4665-b579-fb8e0a999783","Type":"ContainerStarted","Data":"12d7945782e674a3e3d87a81ef4fb75c5a579a1d05dbfaf53bb247c83ae26360"} Mar 12 12:41:48.334719 master-0 kubenswrapper[13984]: I0312 12:41:48.334629 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" 
event={"ID":"8e356084-3bba-4665-b579-fb8e0a999783","Type":"ContainerStarted","Data":"a01368e4348317620a418d091c2660119a3cfbc3297dd84e7a52aed8037e49c0"} Mar 12 12:41:48.365439 master-0 kubenswrapper[13984]: I0312 12:41:48.363923 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-hmdgg" podStartSLOduration=1.476495125 podStartE2EDuration="13.363903818s" podCreationTimestamp="2026-03-12 12:41:35 +0000 UTC" firstStartedPulling="2026-03-12 12:41:36.065302187 +0000 UTC m=+1028.263317679" lastFinishedPulling="2026-03-12 12:41:47.95271087 +0000 UTC m=+1040.150726372" observedRunningTime="2026-03-12 12:41:48.360025487 +0000 UTC m=+1040.558040989" watchObservedRunningTime="2026-03-12 12:41:48.363903818 +0000 UTC m=+1040.561919310" Mar 12 12:41:49.154615 master-0 kubenswrapper[13984]: I0312 12:41:49.154551 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:49.197061 master-0 kubenswrapper[13984]: I0312 12:41:49.197016 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:41:49.344025 master-0 kubenswrapper[13984]: I0312 12:41:49.343892 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" event={"ID":"2a98b253-d5ac-4e39-ac14-d24c5dad3ec2","Type":"ContainerStarted","Data":"1f12d99accee05ec6eb8ffb308ed4aae3e772d102e41e3eb8f1a429fa95028d5"} Mar 12 12:41:49.410656 master-0 kubenswrapper[13984]: I0312 12:41:49.410093 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-9n27b" podStartSLOduration=1.816012307 podStartE2EDuration="14.410041652s" podCreationTimestamp="2026-03-12 12:41:35 +0000 UTC" firstStartedPulling="2026-03-12 12:41:36.462240727 +0000 UTC m=+1028.660256219" lastFinishedPulling="2026-03-12 12:41:49.056270062 +0000 UTC 
m=+1041.254285564" observedRunningTime="2026-03-12 12:41:49.360471828 +0000 UTC m=+1041.558487320" watchObservedRunningTime="2026-03-12 12:41:49.410041652 +0000 UTC m=+1041.608057144" Mar 12 12:41:50.845258 master-0 kubenswrapper[13984]: I0312 12:41:50.845167 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-tpsct" Mar 12 12:41:53.515913 master-0 kubenswrapper[13984]: I0312 12:41:53.515817 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-c6n7v" Mar 12 12:41:53.751517 master-0 kubenswrapper[13984]: I0312 12:41:53.751439 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-bvl5s" Mar 12 12:41:56.340755 master-0 kubenswrapper[13984]: I0312 12:41:56.340702 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-zhp64" Mar 12 12:42:01.236925 master-0 kubenswrapper[13984]: I0312 12:42:01.236832 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-5ngc6"] Mar 12 12:42:01.238203 master-0 kubenswrapper[13984]: I0312 12:42:01.238100 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.240400 master-0 kubenswrapper[13984]: I0312 12:42:01.240336 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Mar 12 12:42:01.262839 master-0 kubenswrapper[13984]: I0312 12:42:01.261517 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-5ngc6"] Mar 12 12:42:01.370430 master-0 kubenswrapper[13984]: I0312 12:42:01.370348 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9466ac17-5416-449b-9492-e88ff8ff8cca-metrics-cert\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370430 master-0 kubenswrapper[13984]: I0312 12:42:01.370426 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-sys\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370770 master-0 kubenswrapper[13984]: I0312 12:42:01.370453 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-lvmd-config\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370770 master-0 kubenswrapper[13984]: I0312 12:42:01.370548 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-csi-plugin-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " 
pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370770 master-0 kubenswrapper[13984]: I0312 12:42:01.370589 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7lfh\" (UniqueName: \"kubernetes.io/projected/9466ac17-5416-449b-9492-e88ff8ff8cca-kube-api-access-m7lfh\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370770 master-0 kubenswrapper[13984]: I0312 12:42:01.370651 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-pod-volumes-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370770 master-0 kubenswrapper[13984]: I0312 12:42:01.370719 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-run-udev\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370981 master-0 kubenswrapper[13984]: I0312 12:42:01.370815 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-device-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370981 master-0 kubenswrapper[13984]: I0312 12:42:01.370886 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-registration-dir\") pod \"vg-manager-5ngc6\" (UID: 
\"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.370981 master-0 kubenswrapper[13984]: I0312 12:42:01.370938 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-node-plugin-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.371107 master-0 kubenswrapper[13984]: I0312 12:42:01.371006 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-file-lock-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.472184 master-0 kubenswrapper[13984]: I0312 12:42:01.472118 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9466ac17-5416-449b-9492-e88ff8ff8cca-metrics-cert\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.472184 master-0 kubenswrapper[13984]: I0312 12:42:01.472185 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-sys\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.472538 master-0 kubenswrapper[13984]: I0312 12:42:01.472440 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-lvmd-config\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " 
pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.472702 master-0 kubenswrapper[13984]: I0312 12:42:01.472675 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-csi-plugin-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.472765 master-0 kubenswrapper[13984]: I0312 12:42:01.472713 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7lfh\" (UniqueName: \"kubernetes.io/projected/9466ac17-5416-449b-9492-e88ff8ff8cca-kube-api-access-m7lfh\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.472841 master-0 kubenswrapper[13984]: I0312 12:42:01.472817 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-pod-volumes-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.472928 master-0 kubenswrapper[13984]: I0312 12:42:01.472687 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-sys\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473003 master-0 kubenswrapper[13984]: I0312 12:42:01.472974 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-run-udev\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473153 master-0 
kubenswrapper[13984]: I0312 12:42:01.473134 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-device-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473270 master-0 kubenswrapper[13984]: I0312 12:42:01.473249 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-registration-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473549 master-0 kubenswrapper[13984]: I0312 12:42:01.473265 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-csi-plugin-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473620 master-0 kubenswrapper[13984]: I0312 12:42:01.473554 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-registration-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473669 master-0 kubenswrapper[13984]: I0312 12:42:01.473021 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-run-udev\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473669 master-0 kubenswrapper[13984]: I0312 12:42:01.473035 13984 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-lvmd-config\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473669 master-0 kubenswrapper[13984]: I0312 12:42:01.473416 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-device-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473800 master-0 kubenswrapper[13984]: I0312 12:42:01.473314 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-pod-volumes-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.473894 master-0 kubenswrapper[13984]: I0312 12:42:01.473874 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-node-plugin-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.474023 master-0 kubenswrapper[13984]: I0312 12:42:01.474006 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-file-lock-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.474215 master-0 kubenswrapper[13984]: I0312 12:42:01.474176 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: 
\"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-node-plugin-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.477496 master-0 kubenswrapper[13984]: I0312 12:42:01.474690 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/9466ac17-5416-449b-9492-e88ff8ff8cca-file-lock-dir\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.485644 master-0 kubenswrapper[13984]: I0312 12:42:01.478503 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/9466ac17-5416-449b-9492-e88ff8ff8cca-metrics-cert\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.492234 master-0 kubenswrapper[13984]: I0312 12:42:01.492107 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7lfh\" (UniqueName: \"kubernetes.io/projected/9466ac17-5416-449b-9492-e88ff8ff8cca-kube-api-access-m7lfh\") pod \"vg-manager-5ngc6\" (UID: \"9466ac17-5416-449b-9492-e88ff8ff8cca\") " pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.559951 master-0 kubenswrapper[13984]: I0312 12:42:01.559891 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:01.989998 master-0 kubenswrapper[13984]: W0312 12:42:01.989898 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9466ac17_5416_449b_9492_e88ff8ff8cca.slice/crio-a6b87e648004eaaaf4662611fa3197f78fea937c22030b7c7c3413b5b4a35617 WatchSource:0}: Error finding container a6b87e648004eaaaf4662611fa3197f78fea937c22030b7c7c3413b5b4a35617: Status 404 returned error can't find the container with id a6b87e648004eaaaf4662611fa3197f78fea937c22030b7c7c3413b5b4a35617 Mar 12 12:42:01.992875 master-0 kubenswrapper[13984]: I0312 12:42:01.992573 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-5ngc6"] Mar 12 12:42:02.462367 master-0 kubenswrapper[13984]: I0312 12:42:02.462283 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5ngc6" event={"ID":"9466ac17-5416-449b-9492-e88ff8ff8cca","Type":"ContainerStarted","Data":"2996bd90fcd57ca6ee363b21eae1d6c07d7da444332b31033cd7f78918dfb30b"} Mar 12 12:42:02.462367 master-0 kubenswrapper[13984]: I0312 12:42:02.462357 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5ngc6" event={"ID":"9466ac17-5416-449b-9492-e88ff8ff8cca","Type":"ContainerStarted","Data":"a6b87e648004eaaaf4662611fa3197f78fea937c22030b7c7c3413b5b4a35617"} Mar 12 12:42:02.489157 master-0 kubenswrapper[13984]: I0312 12:42:02.489069 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-5ngc6" podStartSLOduration=1.489049995 podStartE2EDuration="1.489049995s" podCreationTimestamp="2026-03-12 12:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:42:02.488859789 +0000 UTC m=+1054.686875291" watchObservedRunningTime="2026-03-12 12:42:02.489049995 +0000 
UTC m=+1054.687065497" Mar 12 12:42:04.156136 master-0 kubenswrapper[13984]: I0312 12:42:04.155944 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jgcbm" Mar 12 12:42:04.497391 master-0 kubenswrapper[13984]: I0312 12:42:04.497302 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-5ngc6_9466ac17-5416-449b-9492-e88ff8ff8cca/vg-manager/0.log" Mar 12 12:42:04.497637 master-0 kubenswrapper[13984]: I0312 12:42:04.497398 13984 generic.go:334] "Generic (PLEG): container finished" podID="9466ac17-5416-449b-9492-e88ff8ff8cca" containerID="2996bd90fcd57ca6ee363b21eae1d6c07d7da444332b31033cd7f78918dfb30b" exitCode=1 Mar 12 12:42:04.497637 master-0 kubenswrapper[13984]: I0312 12:42:04.497451 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5ngc6" event={"ID":"9466ac17-5416-449b-9492-e88ff8ff8cca","Type":"ContainerDied","Data":"2996bd90fcd57ca6ee363b21eae1d6c07d7da444332b31033cd7f78918dfb30b"} Mar 12 12:42:04.498423 master-0 kubenswrapper[13984]: I0312 12:42:04.498374 13984 scope.go:117] "RemoveContainer" containerID="2996bd90fcd57ca6ee363b21eae1d6c07d7da444332b31033cd7f78918dfb30b" Mar 12 12:42:04.849368 master-0 kubenswrapper[13984]: I0312 12:42:04.849230 13984 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 12 12:42:05.507451 master-0 kubenswrapper[13984]: I0312 12:42:05.507390 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-5ngc6_9466ac17-5416-449b-9492-e88ff8ff8cca/vg-manager/0.log" Mar 12 12:42:05.507451 master-0 kubenswrapper[13984]: I0312 12:42:05.507446 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-5ngc6" 
event={"ID":"9466ac17-5416-449b-9492-e88ff8ff8cca","Type":"ContainerStarted","Data":"cea82afe21dc82de4060db9b19f6abe4ec3eb5db015881194efa3805ddfd9b68"} Mar 12 12:42:05.729860 master-0 kubenswrapper[13984]: I0312 12:42:05.729727 13984 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-12T12:42:04.849271397Z","Handler":null,"Name":""} Mar 12 12:42:05.732318 master-0 kubenswrapper[13984]: I0312 12:42:05.732285 13984 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 12 12:42:05.732404 master-0 kubenswrapper[13984]: I0312 12:42:05.732343 13984 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 12 12:42:11.419398 master-0 kubenswrapper[13984]: I0312 12:42:11.419304 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7f76bf8d9-qghrl" podUID="5cfd45cf-2239-49a1-887b-e1115fbc5fe3" containerName="console" containerID="cri-o://6b66236ced309d9d764e23b8349415b0dfc74f95cf6030e11271a84a28a86fb1" gracePeriod=15 Mar 12 12:42:11.558869 master-0 kubenswrapper[13984]: I0312 12:42:11.558816 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f76bf8d9-qghrl_5cfd45cf-2239-49a1-887b-e1115fbc5fe3/console/0.log" Mar 12 12:42:11.558869 master-0 kubenswrapper[13984]: I0312 12:42:11.558862 13984 generic.go:334] "Generic (PLEG): container finished" podID="5cfd45cf-2239-49a1-887b-e1115fbc5fe3" containerID="6b66236ced309d9d764e23b8349415b0dfc74f95cf6030e11271a84a28a86fb1" exitCode=2 Mar 12 12:42:11.559146 master-0 kubenswrapper[13984]: I0312 12:42:11.558888 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76bf8d9-qghrl" 
event={"ID":"5cfd45cf-2239-49a1-887b-e1115fbc5fe3","Type":"ContainerDied","Data":"6b66236ced309d9d764e23b8349415b0dfc74f95cf6030e11271a84a28a86fb1"} Mar 12 12:42:11.561117 master-0 kubenswrapper[13984]: I0312 12:42:11.561084 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:11.565684 master-0 kubenswrapper[13984]: I0312 12:42:11.562678 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:11.942938 master-0 kubenswrapper[13984]: I0312 12:42:11.942883 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f76bf8d9-qghrl_5cfd45cf-2239-49a1-887b-e1115fbc5fe3/console/0.log" Mar 12 12:42:11.943132 master-0 kubenswrapper[13984]: I0312 12:42:11.942951 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.972647 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-config\") pod \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.972698 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-serving-cert\") pod \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.972741 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-service-ca\") pod 
\"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.972919 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-trusted-ca-bundle\") pod \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.972939 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvj8h\" (UniqueName: \"kubernetes.io/projected/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-kube-api-access-kvj8h\") pod \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.972981 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-oauth-config\") pod \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.973014 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-oauth-serving-cert\") pod \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\" (UID: \"5cfd45cf-2239-49a1-887b-e1115fbc5fe3\") " Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.973294 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-config" (OuterVolumeSpecName: "console-config") pod "5cfd45cf-2239-49a1-887b-e1115fbc5fe3" (UID: "5cfd45cf-2239-49a1-887b-e1115fbc5fe3"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.973803 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5cfd45cf-2239-49a1-887b-e1115fbc5fe3" (UID: "5cfd45cf-2239-49a1-887b-e1115fbc5fe3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:42:11.976506 master-0 kubenswrapper[13984]: I0312 12:42:11.974187 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-service-ca" (OuterVolumeSpecName: "service-ca") pod "5cfd45cf-2239-49a1-887b-e1115fbc5fe3" (UID: "5cfd45cf-2239-49a1-887b-e1115fbc5fe3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:42:11.977023 master-0 kubenswrapper[13984]: I0312 12:42:11.976529 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-kube-api-access-kvj8h" (OuterVolumeSpecName: "kube-api-access-kvj8h") pod "5cfd45cf-2239-49a1-887b-e1115fbc5fe3" (UID: "5cfd45cf-2239-49a1-887b-e1115fbc5fe3"). InnerVolumeSpecName "kube-api-access-kvj8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:42:11.982804 master-0 kubenswrapper[13984]: I0312 12:42:11.982651 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5cfd45cf-2239-49a1-887b-e1115fbc5fe3" (UID: "5cfd45cf-2239-49a1-887b-e1115fbc5fe3"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:42:11.982804 master-0 kubenswrapper[13984]: I0312 12:42:11.982722 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5cfd45cf-2239-49a1-887b-e1115fbc5fe3" (UID: "5cfd45cf-2239-49a1-887b-e1115fbc5fe3"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:42:11.984214 master-0 kubenswrapper[13984]: I0312 12:42:11.983958 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5cfd45cf-2239-49a1-887b-e1115fbc5fe3" (UID: "5cfd45cf-2239-49a1-887b-e1115fbc5fe3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:42:12.074832 master-0 kubenswrapper[13984]: I0312 12:42:12.074736 13984 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:42:12.074832 master-0 kubenswrapper[13984]: I0312 12:42:12.074790 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvj8h\" (UniqueName: \"kubernetes.io/projected/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-kube-api-access-kvj8h\") on node \"master-0\" DevicePath \"\"" Mar 12 12:42:12.074832 master-0 kubenswrapper[13984]: I0312 12:42:12.074809 13984 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:42:12.074832 master-0 kubenswrapper[13984]: I0312 12:42:12.074826 13984 reconciler_common.go:293] "Volume 
detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 12 12:42:12.074832 master-0 kubenswrapper[13984]: I0312 12:42:12.074846 13984 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:42:12.074832 master-0 kubenswrapper[13984]: I0312 12:42:12.074866 13984 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 12 12:42:12.075758 master-0 kubenswrapper[13984]: I0312 12:42:12.074882 13984 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5cfd45cf-2239-49a1-887b-e1115fbc5fe3-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 12 12:42:12.569849 master-0 kubenswrapper[13984]: I0312 12:42:12.569723 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7f76bf8d9-qghrl_5cfd45cf-2239-49a1-887b-e1115fbc5fe3/console/0.log" Mar 12 12:42:12.570639 master-0 kubenswrapper[13984]: I0312 12:42:12.569952 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7f76bf8d9-qghrl" event={"ID":"5cfd45cf-2239-49a1-887b-e1115fbc5fe3","Type":"ContainerDied","Data":"f742dbc9ecc16f1b8a5bab49d77045dd6343b71a802a443cc30146d026e4e431"} Mar 12 12:42:12.570639 master-0 kubenswrapper[13984]: I0312 12:42:12.570051 13984 scope.go:117] "RemoveContainer" containerID="6b66236ced309d9d764e23b8349415b0dfc74f95cf6030e11271a84a28a86fb1" Mar 12 12:42:12.570639 master-0 kubenswrapper[13984]: I0312 12:42:12.569980 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7f76bf8d9-qghrl" Mar 12 12:42:12.570853 master-0 kubenswrapper[13984]: I0312 12:42:12.570748 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:12.571821 master-0 kubenswrapper[13984]: I0312 12:42:12.571776 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-5ngc6" Mar 12 12:42:12.663484 master-0 kubenswrapper[13984]: I0312 12:42:12.663442 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7f76bf8d9-qghrl"] Mar 12 12:42:12.673510 master-0 kubenswrapper[13984]: I0312 12:42:12.671599 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7f76bf8d9-qghrl"] Mar 12 12:42:13.994129 master-0 kubenswrapper[13984]: I0312 12:42:13.994048 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cfd45cf-2239-49a1-887b-e1115fbc5fe3" path="/var/lib/kubelet/pods/5cfd45cf-2239-49a1-887b-e1115fbc5fe3/volumes" Mar 12 12:42:14.671989 master-0 kubenswrapper[13984]: I0312 12:42:14.671940 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-thptr"] Mar 12 12:42:14.672510 master-0 kubenswrapper[13984]: E0312 12:42:14.672495 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cfd45cf-2239-49a1-887b-e1115fbc5fe3" containerName="console" Mar 12 12:42:14.681602 master-0 kubenswrapper[13984]: I0312 12:42:14.681544 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cfd45cf-2239-49a1-887b-e1115fbc5fe3" containerName="console" Mar 12 12:42:14.682254 master-0 kubenswrapper[13984]: I0312 12:42:14.682236 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cfd45cf-2239-49a1-887b-e1115fbc5fe3" containerName="console" Mar 12 12:42:14.682866 master-0 kubenswrapper[13984]: I0312 12:42:14.682850 13984 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-thptr" Mar 12 12:42:14.685602 master-0 kubenswrapper[13984]: I0312 12:42:14.685544 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-thptr"] Mar 12 12:42:14.686075 master-0 kubenswrapper[13984]: I0312 12:42:14.686053 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 12 12:42:14.686267 master-0 kubenswrapper[13984]: I0312 12:42:14.686089 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 12 12:42:14.827948 master-0 kubenswrapper[13984]: I0312 12:42:14.827871 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26w9w\" (UniqueName: \"kubernetes.io/projected/f0804546-d6d4-47f9-96e4-464f0095a043-kube-api-access-26w9w\") pod \"openstack-operator-index-thptr\" (UID: \"f0804546-d6d4-47f9-96e4-464f0095a043\") " pod="openstack-operators/openstack-operator-index-thptr" Mar 12 12:42:14.929273 master-0 kubenswrapper[13984]: I0312 12:42:14.929159 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26w9w\" (UniqueName: \"kubernetes.io/projected/f0804546-d6d4-47f9-96e4-464f0095a043-kube-api-access-26w9w\") pod \"openstack-operator-index-thptr\" (UID: \"f0804546-d6d4-47f9-96e4-464f0095a043\") " pod="openstack-operators/openstack-operator-index-thptr" Mar 12 12:42:14.944672 master-0 kubenswrapper[13984]: I0312 12:42:14.944628 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26w9w\" (UniqueName: \"kubernetes.io/projected/f0804546-d6d4-47f9-96e4-464f0095a043-kube-api-access-26w9w\") pod \"openstack-operator-index-thptr\" (UID: \"f0804546-d6d4-47f9-96e4-464f0095a043\") " pod="openstack-operators/openstack-operator-index-thptr" Mar 12 12:42:15.028155 
master-0 kubenswrapper[13984]: I0312 12:42:15.028095 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-thptr"
Mar 12 12:42:15.459795 master-0 kubenswrapper[13984]: I0312 12:42:15.459727 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-thptr"]
Mar 12 12:42:15.600147 master-0 kubenswrapper[13984]: I0312 12:42:15.600085 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-thptr" event={"ID":"f0804546-d6d4-47f9-96e4-464f0095a043","Type":"ContainerStarted","Data":"d2f610f1eb5d689e0ab9b9c1f350186e62dd1a72fa0c9193e3fcfc9200a6eedb"}
Mar 12 12:42:16.609143 master-0 kubenswrapper[13984]: I0312 12:42:16.609081 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-thptr" event={"ID":"f0804546-d6d4-47f9-96e4-464f0095a043","Type":"ContainerStarted","Data":"dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb"}
Mar 12 12:42:16.626420 master-0 kubenswrapper[13984]: I0312 12:42:16.626351 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-thptr" podStartSLOduration=1.948095072 podStartE2EDuration="2.6263348s" podCreationTimestamp="2026-03-12 12:42:14 +0000 UTC" firstStartedPulling="2026-03-12 12:42:15.462134508 +0000 UTC m=+1067.660150000" lastFinishedPulling="2026-03-12 12:42:16.140374236 +0000 UTC m=+1068.338389728" observedRunningTime="2026-03-12 12:42:16.625255199 +0000 UTC m=+1068.823270691" watchObservedRunningTime="2026-03-12 12:42:16.6263348 +0000 UTC m=+1068.824350292"
Mar 12 12:42:18.797636 master-0 kubenswrapper[13984]: I0312 12:42:18.797583 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-thptr"]
Mar 12 12:42:18.798560 master-0 kubenswrapper[13984]: I0312 12:42:18.798528 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-thptr" podUID="f0804546-d6d4-47f9-96e4-464f0095a043" containerName="registry-server" containerID="cri-o://dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb" gracePeriod=2
Mar 12 12:42:19.273841 master-0 kubenswrapper[13984]: I0312 12:42:19.273801 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-thptr"
Mar 12 12:42:19.313317 master-0 kubenswrapper[13984]: I0312 12:42:19.307696 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26w9w\" (UniqueName: \"kubernetes.io/projected/f0804546-d6d4-47f9-96e4-464f0095a043-kube-api-access-26w9w\") pod \"f0804546-d6d4-47f9-96e4-464f0095a043\" (UID: \"f0804546-d6d4-47f9-96e4-464f0095a043\") "
Mar 12 12:42:19.320802 master-0 kubenswrapper[13984]: I0312 12:42:19.320760 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0804546-d6d4-47f9-96e4-464f0095a043-kube-api-access-26w9w" (OuterVolumeSpecName: "kube-api-access-26w9w") pod "f0804546-d6d4-47f9-96e4-464f0095a043" (UID: "f0804546-d6d4-47f9-96e4-464f0095a043"). InnerVolumeSpecName "kube-api-access-26w9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:42:19.409705 master-0 kubenswrapper[13984]: I0312 12:42:19.409651 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26w9w\" (UniqueName: \"kubernetes.io/projected/f0804546-d6d4-47f9-96e4-464f0095a043-kube-api-access-26w9w\") on node \"master-0\" DevicePath \"\""
Mar 12 12:42:19.415582 master-0 kubenswrapper[13984]: I0312 12:42:19.415520 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-c6wrp"]
Mar 12 12:42:19.415984 master-0 kubenswrapper[13984]: E0312 12:42:19.415952 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0804546-d6d4-47f9-96e4-464f0095a043" containerName="registry-server"
Mar 12 12:42:19.415984 master-0 kubenswrapper[13984]: I0312 12:42:19.415975 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0804546-d6d4-47f9-96e4-464f0095a043" containerName="registry-server"
Mar 12 12:42:19.416189 master-0 kubenswrapper[13984]: I0312 12:42:19.416164 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0804546-d6d4-47f9-96e4-464f0095a043" containerName="registry-server"
Mar 12 12:42:19.416806 master-0 kubenswrapper[13984]: I0312 12:42:19.416774 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:19.429784 master-0 kubenswrapper[13984]: I0312 12:42:19.427334 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c6wrp"]
Mar 12 12:42:19.512709 master-0 kubenswrapper[13984]: I0312 12:42:19.512579 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97np6\" (UniqueName: \"kubernetes.io/projected/44a9db26-7fba-480a-9b9c-89d2cec10284-kube-api-access-97np6\") pod \"openstack-operator-index-c6wrp\" (UID: \"44a9db26-7fba-480a-9b9c-89d2cec10284\") " pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:19.613727 master-0 kubenswrapper[13984]: I0312 12:42:19.613677 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97np6\" (UniqueName: \"kubernetes.io/projected/44a9db26-7fba-480a-9b9c-89d2cec10284-kube-api-access-97np6\") pod \"openstack-operator-index-c6wrp\" (UID: \"44a9db26-7fba-480a-9b9c-89d2cec10284\") " pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:19.640829 master-0 kubenswrapper[13984]: I0312 12:42:19.640787 13984 generic.go:334] "Generic (PLEG): container finished" podID="f0804546-d6d4-47f9-96e4-464f0095a043" containerID="dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb" exitCode=0
Mar 12 12:42:19.640829 master-0 kubenswrapper[13984]: I0312 12:42:19.640827 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-thptr" event={"ID":"f0804546-d6d4-47f9-96e4-464f0095a043","Type":"ContainerDied","Data":"dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb"}
Mar 12 12:42:19.641079 master-0 kubenswrapper[13984]: I0312 12:42:19.640856 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-thptr" event={"ID":"f0804546-d6d4-47f9-96e4-464f0095a043","Type":"ContainerDied","Data":"d2f610f1eb5d689e0ab9b9c1f350186e62dd1a72fa0c9193e3fcfc9200a6eedb"}
Mar 12 12:42:19.641079 master-0 kubenswrapper[13984]: I0312 12:42:19.640873 13984 scope.go:117] "RemoveContainer" containerID="dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb"
Mar 12 12:42:19.641457 master-0 kubenswrapper[13984]: I0312 12:42:19.641439 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-thptr"
Mar 12 12:42:19.644987 master-0 kubenswrapper[13984]: I0312 12:42:19.644950 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97np6\" (UniqueName: \"kubernetes.io/projected/44a9db26-7fba-480a-9b9c-89d2cec10284-kube-api-access-97np6\") pod \"openstack-operator-index-c6wrp\" (UID: \"44a9db26-7fba-480a-9b9c-89d2cec10284\") " pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:19.685865 master-0 kubenswrapper[13984]: I0312 12:42:19.685825 13984 scope.go:117] "RemoveContainer" containerID="dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb"
Mar 12 12:42:19.686615 master-0 kubenswrapper[13984]: E0312 12:42:19.686576 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb\": container with ID starting with dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb not found: ID does not exist" containerID="dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb"
Mar 12 12:42:19.686689 master-0 kubenswrapper[13984]: I0312 12:42:19.686632 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb"} err="failed to get container status \"dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb\": rpc error: code = NotFound desc = could not find container \"dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb\": container with ID starting with dac126733f4ebada1b1d026caa75ad24388d3b6f632d16b257acd4e9b12bbeeb not found: ID does not exist"
Mar 12 12:42:19.706709 master-0 kubenswrapper[13984]: I0312 12:42:19.706645 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-thptr"]
Mar 12 12:42:19.714864 master-0 kubenswrapper[13984]: I0312 12:42:19.714815 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-thptr"]
Mar 12 12:42:19.738776 master-0 kubenswrapper[13984]: I0312 12:42:19.738742 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:20.006831 master-0 kubenswrapper[13984]: I0312 12:42:20.006768 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0804546-d6d4-47f9-96e4-464f0095a043" path="/var/lib/kubelet/pods/f0804546-d6d4-47f9-96e4-464f0095a043/volumes"
Mar 12 12:42:20.168843 master-0 kubenswrapper[13984]: I0312 12:42:20.168805 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-c6wrp"]
Mar 12 12:42:20.171115 master-0 kubenswrapper[13984]: W0312 12:42:20.171055 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a9db26_7fba_480a_9b9c_89d2cec10284.slice/crio-32688da7649126e971d309575cc72b5a279aeac9efce68cb94173c2f446c1ed3 WatchSource:0}: Error finding container 32688da7649126e971d309575cc72b5a279aeac9efce68cb94173c2f446c1ed3: Status 404 returned error can't find the container with id 32688da7649126e971d309575cc72b5a279aeac9efce68cb94173c2f446c1ed3
Mar 12 12:42:20.651039 master-0 kubenswrapper[13984]: I0312 12:42:20.650960 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c6wrp" event={"ID":"44a9db26-7fba-480a-9b9c-89d2cec10284","Type":"ContainerStarted","Data":"32688da7649126e971d309575cc72b5a279aeac9efce68cb94173c2f446c1ed3"}
Mar 12 12:42:21.661792 master-0 kubenswrapper[13984]: I0312 12:42:21.661038 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-c6wrp" event={"ID":"44a9db26-7fba-480a-9b9c-89d2cec10284","Type":"ContainerStarted","Data":"93a7686b8e495650eb7bddabfc3e7d6c796511c87ed536eb7991c4b7be3aa2ec"}
Mar 12 12:42:21.679300 master-0 kubenswrapper[13984]: I0312 12:42:21.679208 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-c6wrp" podStartSLOduration=2.233206881 podStartE2EDuration="2.679188179s" podCreationTimestamp="2026-03-12 12:42:19 +0000 UTC" firstStartedPulling="2026-03-12 12:42:20.175452784 +0000 UTC m=+1072.373468276" lastFinishedPulling="2026-03-12 12:42:20.621434082 +0000 UTC m=+1072.819449574" observedRunningTime="2026-03-12 12:42:21.678214351 +0000 UTC m=+1073.876229853" watchObservedRunningTime="2026-03-12 12:42:21.679188179 +0000 UTC m=+1073.877203701"
Mar 12 12:42:29.739669 master-0 kubenswrapper[13984]: I0312 12:42:29.739600 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:29.740298 master-0 kubenswrapper[13984]: I0312 12:42:29.740280 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:29.780170 master-0 kubenswrapper[13984]: I0312 12:42:29.780135 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:30.769092 master-0 kubenswrapper[13984]: I0312 12:42:30.769023 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-c6wrp"
Mar 12 12:42:35.290053 master-0 kubenswrapper[13984]: I0312 12:42:35.277919 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"]
Mar 12 12:42:35.290053 master-0 kubenswrapper[13984]: I0312 12:42:35.279556 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.304428 master-0 kubenswrapper[13984]: I0312 12:42:35.303279 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"]
Mar 12 12:42:35.476099 master-0 kubenswrapper[13984]: I0312 12:42:35.475997 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9pw\" (UniqueName: \"kubernetes.io/projected/815a5222-56be-4570-92a1-0cd0d1b7941d-kube-api-access-jt9pw\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.476099 master-0 kubenswrapper[13984]: I0312 12:42:35.476087 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.476389 master-0 kubenswrapper[13984]: I0312 12:42:35.476184 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.578000 master-0 kubenswrapper[13984]: I0312 12:42:35.577804 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.578000 master-0 kubenswrapper[13984]: I0312 12:42:35.577921 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9pw\" (UniqueName: \"kubernetes.io/projected/815a5222-56be-4570-92a1-0cd0d1b7941d-kube-api-access-jt9pw\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.578000 master-0 kubenswrapper[13984]: I0312 12:42:35.577963 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.578545 master-0 kubenswrapper[13984]: I0312 12:42:35.578514 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-util\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.578757 master-0 kubenswrapper[13984]: I0312 12:42:35.578701 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-bundle\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.599907 master-0 kubenswrapper[13984]: I0312 12:42:35.599809 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9pw\" (UniqueName: \"kubernetes.io/projected/815a5222-56be-4570-92a1-0cd0d1b7941d-kube-api-access-jt9pw\") pod \"f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") " pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:35.608358 master-0 kubenswrapper[13984]: I0312 12:42:35.608296 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:36.056088 master-0 kubenswrapper[13984]: I0312 12:42:36.056003 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"]
Mar 12 12:42:36.801443 master-0 kubenswrapper[13984]: I0312 12:42:36.801355 13984 generic.go:334] "Generic (PLEG): container finished" podID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerID="b92b0a478d40be1ebb3e40eb854a244803f6ec533f4becffa25016b564d06ba5" exitCode=0
Mar 12 12:42:36.802086 master-0 kubenswrapper[13984]: I0312 12:42:36.801436 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g" event={"ID":"815a5222-56be-4570-92a1-0cd0d1b7941d","Type":"ContainerDied","Data":"b92b0a478d40be1ebb3e40eb854a244803f6ec533f4becffa25016b564d06ba5"}
Mar 12 12:42:36.802086 master-0 kubenswrapper[13984]: I0312 12:42:36.801525 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g" event={"ID":"815a5222-56be-4570-92a1-0cd0d1b7941d","Type":"ContainerStarted","Data":"f5b2ee8b7d6d3a2530c6f4a228694ef0000457cb9281c181a7f575ef80ff065f"}
Mar 12 12:42:38.826616 master-0 kubenswrapper[13984]: I0312 12:42:38.826533 13984 generic.go:334] "Generic (PLEG): container finished" podID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerID="c453149659481dd858756e7050044032814eafc821c4556256350e6b960570f3" exitCode=0
Mar 12 12:42:38.826616 master-0 kubenswrapper[13984]: I0312 12:42:38.826616 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g" event={"ID":"815a5222-56be-4570-92a1-0cd0d1b7941d","Type":"ContainerDied","Data":"c453149659481dd858756e7050044032814eafc821c4556256350e6b960570f3"}
Mar 12 12:42:39.843265 master-0 kubenswrapper[13984]: I0312 12:42:39.843185 13984 generic.go:334] "Generic (PLEG): container finished" podID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerID="e1bd0f931ce8152dc01e4cee125f9db49d6c247ba1b261a367e31457ee6c8249" exitCode=0
Mar 12 12:42:39.843265 master-0 kubenswrapper[13984]: I0312 12:42:39.843244 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g" event={"ID":"815a5222-56be-4570-92a1-0cd0d1b7941d","Type":"ContainerDied","Data":"e1bd0f931ce8152dc01e4cee125f9db49d6c247ba1b261a367e31457ee6c8249"}
Mar 12 12:42:41.163192 master-0 kubenswrapper[13984]: I0312 12:42:41.163133 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:41.287037 master-0 kubenswrapper[13984]: I0312 12:42:41.286963 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jt9pw\" (UniqueName: \"kubernetes.io/projected/815a5222-56be-4570-92a1-0cd0d1b7941d-kube-api-access-jt9pw\") pod \"815a5222-56be-4570-92a1-0cd0d1b7941d\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") "
Mar 12 12:42:41.287244 master-0 kubenswrapper[13984]: I0312 12:42:41.287061 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-bundle\") pod \"815a5222-56be-4570-92a1-0cd0d1b7941d\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") "
Mar 12 12:42:41.287244 master-0 kubenswrapper[13984]: I0312 12:42:41.287163 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-util\") pod \"815a5222-56be-4570-92a1-0cd0d1b7941d\" (UID: \"815a5222-56be-4570-92a1-0cd0d1b7941d\") "
Mar 12 12:42:41.287894 master-0 kubenswrapper[13984]: I0312 12:42:41.287837 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-bundle" (OuterVolumeSpecName: "bundle") pod "815a5222-56be-4570-92a1-0cd0d1b7941d" (UID: "815a5222-56be-4570-92a1-0cd0d1b7941d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:42:41.289620 master-0 kubenswrapper[13984]: I0312 12:42:41.289584 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/815a5222-56be-4570-92a1-0cd0d1b7941d-kube-api-access-jt9pw" (OuterVolumeSpecName: "kube-api-access-jt9pw") pod "815a5222-56be-4570-92a1-0cd0d1b7941d" (UID: "815a5222-56be-4570-92a1-0cd0d1b7941d"). InnerVolumeSpecName "kube-api-access-jt9pw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:42:41.390050 master-0 kubenswrapper[13984]: I0312 12:42:41.389660 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jt9pw\" (UniqueName: \"kubernetes.io/projected/815a5222-56be-4570-92a1-0cd0d1b7941d-kube-api-access-jt9pw\") on node \"master-0\" DevicePath \"\""
Mar 12 12:42:41.390050 master-0 kubenswrapper[13984]: I0312 12:42:41.389965 13984 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:42:41.408078 master-0 kubenswrapper[13984]: I0312 12:42:41.408029 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-util" (OuterVolumeSpecName: "util") pod "815a5222-56be-4570-92a1-0cd0d1b7941d" (UID: "815a5222-56be-4570-92a1-0cd0d1b7941d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:42:41.491436 master-0 kubenswrapper[13984]: I0312 12:42:41.491368 13984 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/815a5222-56be-4570-92a1-0cd0d1b7941d-util\") on node \"master-0\" DevicePath \"\""
Mar 12 12:42:41.864083 master-0 kubenswrapper[13984]: I0312 12:42:41.864006 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g" event={"ID":"815a5222-56be-4570-92a1-0cd0d1b7941d","Type":"ContainerDied","Data":"f5b2ee8b7d6d3a2530c6f4a228694ef0000457cb9281c181a7f575ef80ff065f"}
Mar 12 12:42:41.864083 master-0 kubenswrapper[13984]: I0312 12:42:41.864072 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5b2ee8b7d6d3a2530c6f4a228694ef0000457cb9281c181a7f575ef80ff065f"
Mar 12 12:42:41.864384 master-0 kubenswrapper[13984]: I0312 12:42:41.864162 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g"
Mar 12 12:42:48.499518 master-0 kubenswrapper[13984]: I0312 12:42:48.499352 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"]
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: E0312 12:42:48.499726 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerName="util"
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: I0312 12:42:48.499741 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerName="util"
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: E0312 12:42:48.499773 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerName="extract"
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: I0312 12:42:48.499781 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerName="extract"
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: E0312 12:42:48.499801 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerName="pull"
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: I0312 12:42:48.499808 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerName="pull"
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: I0312 12:42:48.499979 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="815a5222-56be-4570-92a1-0cd0d1b7941d" containerName="extract"
Mar 12 12:42:48.501054 master-0 kubenswrapper[13984]: I0312 12:42:48.500787 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"
Mar 12 12:42:48.546843 master-0 kubenswrapper[13984]: I0312 12:42:48.535470 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"]
Mar 12 12:42:48.615510 master-0 kubenswrapper[13984]: I0312 12:42:48.612583 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sjqh\" (UniqueName: \"kubernetes.io/projected/a3bf35a3-098b-458a-9482-e7946bec4917-kube-api-access-4sjqh\") pod \"openstack-operator-controller-init-65b9994cf8-fg2sk\" (UID: \"a3bf35a3-098b-458a-9482-e7946bec4917\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"
Mar 12 12:42:48.713749 master-0 kubenswrapper[13984]: I0312 12:42:48.713688 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sjqh\" (UniqueName: \"kubernetes.io/projected/a3bf35a3-098b-458a-9482-e7946bec4917-kube-api-access-4sjqh\") pod \"openstack-operator-controller-init-65b9994cf8-fg2sk\" (UID: \"a3bf35a3-098b-458a-9482-e7946bec4917\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"
Mar 12 12:42:48.741404 master-0 kubenswrapper[13984]: I0312 12:42:48.741350 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sjqh\" (UniqueName: \"kubernetes.io/projected/a3bf35a3-098b-458a-9482-e7946bec4917-kube-api-access-4sjqh\") pod \"openstack-operator-controller-init-65b9994cf8-fg2sk\" (UID: \"a3bf35a3-098b-458a-9482-e7946bec4917\") " pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"
Mar 12 12:42:48.817249 master-0 kubenswrapper[13984]: I0312 12:42:48.817123 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"
Mar 12 12:42:49.319622 master-0 kubenswrapper[13984]: I0312 12:42:49.319554 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"]
Mar 12 12:42:49.320794 master-0 kubenswrapper[13984]: W0312 12:42:49.320760 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3bf35a3_098b_458a_9482_e7946bec4917.slice/crio-83158910aa72b16d1a4c443be1b4e5cf52daf24a8de894f1a4b5eca37fb7cdad WatchSource:0}: Error finding container 83158910aa72b16d1a4c443be1b4e5cf52daf24a8de894f1a4b5eca37fb7cdad: Status 404 returned error can't find the container with id 83158910aa72b16d1a4c443be1b4e5cf52daf24a8de894f1a4b5eca37fb7cdad
Mar 12 12:42:49.959450 master-0 kubenswrapper[13984]: I0312 12:42:49.959381 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk" event={"ID":"a3bf35a3-098b-458a-9482-e7946bec4917","Type":"ContainerStarted","Data":"83158910aa72b16d1a4c443be1b4e5cf52daf24a8de894f1a4b5eca37fb7cdad"}
Mar 12 12:42:53.992618 master-0 kubenswrapper[13984]: I0312 12:42:53.992567 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk" event={"ID":"a3bf35a3-098b-458a-9482-e7946bec4917","Type":"ContainerStarted","Data":"78edcec32a8677dafdf100026826b0d4f3c19c68463607d008f5530c382453d0"}
Mar 12 12:42:53.993197 master-0 kubenswrapper[13984]: I0312 12:42:53.993181 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"
Mar 12 12:42:54.028877 master-0 kubenswrapper[13984]: I0312 12:42:54.028807 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk" podStartSLOduration=1.9167162260000001 podStartE2EDuration="6.028786297s" podCreationTimestamp="2026-03-12 12:42:48 +0000 UTC" firstStartedPulling="2026-03-12 12:42:49.323594503 +0000 UTC m=+1101.521609995" lastFinishedPulling="2026-03-12 12:42:53.435664584 +0000 UTC m=+1105.633680066" observedRunningTime="2026-03-12 12:42:54.026018028 +0000 UTC m=+1106.224033530" watchObservedRunningTime="2026-03-12 12:42:54.028786297 +0000 UTC m=+1106.226801789"
Mar 12 12:42:58.822169 master-0 kubenswrapper[13984]: I0312 12:42:58.822081 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-65b9994cf8-fg2sk"
Mar 12 12:43:19.337125 master-0 kubenswrapper[13984]: I0312 12:43:19.333532 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl"]
Mar 12 12:43:19.337125 master-0 kubenswrapper[13984]: I0312 12:43:19.335192 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl"
Mar 12 12:43:19.370752 master-0 kubenswrapper[13984]: I0312 12:43:19.364651 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl"]
Mar 12 12:43:19.400856 master-0 kubenswrapper[13984]: I0312 12:43:19.400817 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x"]
Mar 12 12:43:19.401862 master-0 kubenswrapper[13984]: I0312 12:43:19.401840 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x"
Mar 12 12:43:19.423835 master-0 kubenswrapper[13984]: I0312 12:43:19.423781 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t8hl\" (UniqueName: \"kubernetes.io/projected/e7256eef-5c42-4470-98da-e9ccc9d0fa7c-kube-api-access-4t8hl\") pod \"barbican-operator-controller-manager-677bd678f7-htjwl\" (UID: \"e7256eef-5c42-4470-98da-e9ccc9d0fa7c\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl"
Mar 12 12:43:19.424032 master-0 kubenswrapper[13984]: I0312 12:43:19.423885 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gnvg\" (UniqueName: \"kubernetes.io/projected/fe570816-5558-444f-8bd8-6b5a75a80f77-kube-api-access-5gnvg\") pod \"cinder-operator-controller-manager-984cd4dcf-h626x\" (UID: \"fe570816-5558-444f-8bd8-6b5a75a80f77\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x"
Mar 12 12:43:19.460119 master-0 kubenswrapper[13984]: I0312 12:43:19.460009 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x"]
Mar 12 12:43:19.480091 master-0 kubenswrapper[13984]: I0312 12:43:19.480044 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd"]
Mar 12 12:43:19.481383 master-0 kubenswrapper[13984]: I0312 12:43:19.481365 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd"
Mar 12 12:43:19.505351 master-0 kubenswrapper[13984]: I0312 12:43:19.505310 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg"]
Mar 12 12:43:19.506469 master-0 kubenswrapper[13984]: I0312 12:43:19.506447 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg"
Mar 12 12:43:19.524683 master-0 kubenswrapper[13984]: I0312 12:43:19.524652 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd"]
Mar 12 12:43:19.525672 master-0 kubenswrapper[13984]: I0312 12:43:19.525654 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t8hl\" (UniqueName: \"kubernetes.io/projected/e7256eef-5c42-4470-98da-e9ccc9d0fa7c-kube-api-access-4t8hl\") pod \"barbican-operator-controller-manager-677bd678f7-htjwl\" (UID: \"e7256eef-5c42-4470-98da-e9ccc9d0fa7c\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl"
Mar 12 12:43:19.525792 master-0 kubenswrapper[13984]: I0312 12:43:19.525777 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gnvg\" (UniqueName: \"kubernetes.io/projected/fe570816-5558-444f-8bd8-6b5a75a80f77-kube-api-access-5gnvg\") pod \"cinder-operator-controller-manager-984cd4dcf-h626x\" (UID: \"fe570816-5558-444f-8bd8-6b5a75a80f77\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x"
Mar 12 12:43:19.525886 master-0 kubenswrapper[13984]: I0312 12:43:19.525873 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z74t5\" (UniqueName: \"kubernetes.io/projected/f6fe9677-4309-41ac-b5dc-a6e7c95af7e8-kube-api-access-z74t5\") pod \"glance-operator-controller-manager-5964f64c48-jzwpg\" (UID: \"f6fe9677-4309-41ac-b5dc-a6e7c95af7e8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg"
Mar 12 12:43:19.525979 master-0 kubenswrapper[13984]: I0312 12:43:19.525967 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rvvq\" (UniqueName: \"kubernetes.io/projected/225a2621-3a81-48d7-bf57-8cb7355e9acf-kube-api-access-2rvvq\") pod \"designate-operator-controller-manager-66d56f6ff4-c9cfd\" (UID: \"225a2621-3a81-48d7-bf57-8cb7355e9acf\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd"
Mar 12 12:43:19.549736 master-0 kubenswrapper[13984]: I0312 12:43:19.549697 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg"]
Mar 12 12:43:19.594438 master-0 kubenswrapper[13984]: I0312 12:43:19.594198 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gnvg\" (UniqueName: \"kubernetes.io/projected/fe570816-5558-444f-8bd8-6b5a75a80f77-kube-api-access-5gnvg\") pod \"cinder-operator-controller-manager-984cd4dcf-h626x\" (UID: \"fe570816-5558-444f-8bd8-6b5a75a80f77\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x"
Mar 12 12:43:19.610203 master-0 kubenswrapper[13984]: I0312 12:43:19.608246 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t8hl\" (UniqueName: \"kubernetes.io/projected/e7256eef-5c42-4470-98da-e9ccc9d0fa7c-kube-api-access-4t8hl\") pod \"barbican-operator-controller-manager-677bd678f7-htjwl\" (UID: \"e7256eef-5c42-4470-98da-e9ccc9d0fa7c\") " pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl"
Mar 12 12:43:19.611319 master-0 kubenswrapper[13984]: I0312 12:43:19.610965 13984
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl"] Mar 12 12:43:19.612094 master-0 kubenswrapper[13984]: I0312 12:43:19.612066 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" Mar 12 12:43:19.631991 master-0 kubenswrapper[13984]: I0312 12:43:19.630450 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mntxj\" (UniqueName: \"kubernetes.io/projected/8e479d11-fd58-4636-9410-297bb6d4f88f-kube-api-access-mntxj\") pod \"heat-operator-controller-manager-77b6666d85-g5rdl\" (UID: \"8e479d11-fd58-4636-9410-297bb6d4f88f\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" Mar 12 12:43:19.632702 master-0 kubenswrapper[13984]: I0312 12:43:19.632222 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z74t5\" (UniqueName: \"kubernetes.io/projected/f6fe9677-4309-41ac-b5dc-a6e7c95af7e8-kube-api-access-z74t5\") pod \"glance-operator-controller-manager-5964f64c48-jzwpg\" (UID: \"f6fe9677-4309-41ac-b5dc-a6e7c95af7e8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" Mar 12 12:43:19.632702 master-0 kubenswrapper[13984]: I0312 12:43:19.632327 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rvvq\" (UniqueName: \"kubernetes.io/projected/225a2621-3a81-48d7-bf57-8cb7355e9acf-kube-api-access-2rvvq\") pod \"designate-operator-controller-manager-66d56f6ff4-c9cfd\" (UID: \"225a2621-3a81-48d7-bf57-8cb7355e9acf\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" Mar 12 12:43:19.646216 master-0 kubenswrapper[13984]: I0312 12:43:19.645530 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl"] Mar 12 12:43:19.657008 master-0 kubenswrapper[13984]: I0312 12:43:19.655953 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965"] Mar 12 12:43:19.657657 master-0 kubenswrapper[13984]: I0312 12:43:19.657627 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" Mar 12 12:43:19.677092 master-0 kubenswrapper[13984]: I0312 12:43:19.673739 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965"] Mar 12 12:43:19.677092 master-0 kubenswrapper[13984]: I0312 12:43:19.675688 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rvvq\" (UniqueName: \"kubernetes.io/projected/225a2621-3a81-48d7-bf57-8cb7355e9acf-kube-api-access-2rvvq\") pod \"designate-operator-controller-manager-66d56f6ff4-c9cfd\" (UID: \"225a2621-3a81-48d7-bf57-8cb7355e9acf\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" Mar 12 12:43:19.696097 master-0 kubenswrapper[13984]: I0312 12:43:19.694381 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z74t5\" (UniqueName: \"kubernetes.io/projected/f6fe9677-4309-41ac-b5dc-a6e7c95af7e8-kube-api-access-z74t5\") pod \"glance-operator-controller-manager-5964f64c48-jzwpg\" (UID: \"f6fe9677-4309-41ac-b5dc-a6e7c95af7e8\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" Mar 12 12:43:19.703141 master-0 kubenswrapper[13984]: I0312 12:43:19.703030 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl" Mar 12 12:43:19.706831 master-0 kubenswrapper[13984]: I0312 12:43:19.706784 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg"] Mar 12 12:43:19.708233 master-0 kubenswrapper[13984]: I0312 12:43:19.708212 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:19.721491 master-0 kubenswrapper[13984]: I0312 12:43:19.721419 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q"] Mar 12 12:43:19.723048 master-0 kubenswrapper[13984]: I0312 12:43:19.723021 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" Mar 12 12:43:19.723600 master-0 kubenswrapper[13984]: I0312 12:43:19.723463 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 12 12:43:19.734602 master-0 kubenswrapper[13984]: I0312 12:43:19.732112 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x" Mar 12 12:43:19.743304 master-0 kubenswrapper[13984]: I0312 12:43:19.742487 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:19.743304 master-0 kubenswrapper[13984]: I0312 12:43:19.742555 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88flp\" (UniqueName: \"kubernetes.io/projected/b6d55477-329a-4fe6-9af5-a1146ee8844e-kube-api-access-88flp\") pod \"horizon-operator-controller-manager-6d9d6b584d-9t965\" (UID: \"b6d55477-329a-4fe6-9af5-a1146ee8844e\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" Mar 12 12:43:19.743304 master-0 kubenswrapper[13984]: I0312 12:43:19.742901 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mntxj\" (UniqueName: \"kubernetes.io/projected/8e479d11-fd58-4636-9410-297bb6d4f88f-kube-api-access-mntxj\") pod \"heat-operator-controller-manager-77b6666d85-g5rdl\" (UID: \"8e479d11-fd58-4636-9410-297bb6d4f88f\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" Mar 12 12:43:19.743304 master-0 kubenswrapper[13984]: I0312 12:43:19.742962 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrtlw\" (UniqueName: \"kubernetes.io/projected/ee56de80-f409-484e-87df-4a4a9b6cf52e-kube-api-access-zrtlw\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " 
pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:19.763982 master-0 kubenswrapper[13984]: I0312 12:43:19.763248 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg"] Mar 12 12:43:19.805119 master-0 kubenswrapper[13984]: I0312 12:43:19.804365 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mntxj\" (UniqueName: \"kubernetes.io/projected/8e479d11-fd58-4636-9410-297bb6d4f88f-kube-api-access-mntxj\") pod \"heat-operator-controller-manager-77b6666d85-g5rdl\" (UID: \"8e479d11-fd58-4636-9410-297bb6d4f88f\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" Mar 12 12:43:19.808179 master-0 kubenswrapper[13984]: I0312 12:43:19.807495 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" Mar 12 12:43:19.827398 master-0 kubenswrapper[13984]: I0312 12:43:19.827358 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q"] Mar 12 12:43:19.844421 master-0 kubenswrapper[13984]: I0312 12:43:19.844372 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:19.844421 master-0 kubenswrapper[13984]: I0312 12:43:19.844421 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88flp\" (UniqueName: \"kubernetes.io/projected/b6d55477-329a-4fe6-9af5-a1146ee8844e-kube-api-access-88flp\") pod \"horizon-operator-controller-manager-6d9d6b584d-9t965\" (UID: 
\"b6d55477-329a-4fe6-9af5-a1146ee8844e\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" Mar 12 12:43:19.844740 master-0 kubenswrapper[13984]: E0312 12:43:19.844538 13984 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 12:43:19.844740 master-0 kubenswrapper[13984]: E0312 12:43:19.844593 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert podName:ee56de80-f409-484e-87df-4a4a9b6cf52e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:20.344573276 +0000 UTC m=+1132.542588768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert") pod "infra-operator-controller-manager-b8c8d7cc8-6jzjg" (UID: "ee56de80-f409-484e-87df-4a4a9b6cf52e") : secret "infra-operator-webhook-server-cert" not found Mar 12 12:43:19.844740 master-0 kubenswrapper[13984]: I0312 12:43:19.844626 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwd9w\" (UniqueName: \"kubernetes.io/projected/39c85f77-0a7e-4bcc-8f2c-c7801034b477-kube-api-access-vwd9w\") pod \"ironic-operator-controller-manager-6bbb499bbc-vg66q\" (UID: \"39c85f77-0a7e-4bcc-8f2c-c7801034b477\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" Mar 12 12:43:19.844740 master-0 kubenswrapper[13984]: I0312 12:43:19.844697 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrtlw\" (UniqueName: \"kubernetes.io/projected/ee56de80-f409-484e-87df-4a4a9b6cf52e-kube-api-access-zrtlw\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 
12:43:19.865097 master-0 kubenswrapper[13984]: I0312 12:43:19.865046 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" Mar 12 12:43:19.867937 master-0 kubenswrapper[13984]: I0312 12:43:19.867873 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4"] Mar 12 12:43:19.872765 master-0 kubenswrapper[13984]: I0312 12:43:19.872105 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88flp\" (UniqueName: \"kubernetes.io/projected/b6d55477-329a-4fe6-9af5-a1146ee8844e-kube-api-access-88flp\") pod \"horizon-operator-controller-manager-6d9d6b584d-9t965\" (UID: \"b6d55477-329a-4fe6-9af5-a1146ee8844e\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" Mar 12 12:43:19.872765 master-0 kubenswrapper[13984]: I0312 12:43:19.872347 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" Mar 12 12:43:19.887967 master-0 kubenswrapper[13984]: I0312 12:43:19.874499 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrtlw\" (UniqueName: \"kubernetes.io/projected/ee56de80-f409-484e-87df-4a4a9b6cf52e-kube-api-access-zrtlw\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:19.930247 master-0 kubenswrapper[13984]: I0312 12:43:19.930179 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48"] Mar 12 12:43:19.932561 master-0 kubenswrapper[13984]: I0312 12:43:19.932530 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" Mar 12 12:43:19.946862 master-0 kubenswrapper[13984]: I0312 12:43:19.946154 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwd9w\" (UniqueName: \"kubernetes.io/projected/39c85f77-0a7e-4bcc-8f2c-c7801034b477-kube-api-access-vwd9w\") pod \"ironic-operator-controller-manager-6bbb499bbc-vg66q\" (UID: \"39c85f77-0a7e-4bcc-8f2c-c7801034b477\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" Mar 12 12:43:19.946862 master-0 kubenswrapper[13984]: I0312 12:43:19.946255 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj96m\" (UniqueName: \"kubernetes.io/projected/af967d1d-e790-4f8c-85a0-a3758a3b5f77-kube-api-access-tj96m\") pod \"keystone-operator-controller-manager-684f77d66d-th4r4\" (UID: \"af967d1d-e790-4f8c-85a0-a3758a3b5f77\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" Mar 12 12:43:19.961288 master-0 kubenswrapper[13984]: I0312 12:43:19.959534 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4"] Mar 12 12:43:19.964655 master-0 kubenswrapper[13984]: I0312 12:43:19.964397 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" Mar 12 12:43:19.989497 master-0 kubenswrapper[13984]: I0312 12:43:19.989145 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwd9w\" (UniqueName: \"kubernetes.io/projected/39c85f77-0a7e-4bcc-8f2c-c7801034b477-kube-api-access-vwd9w\") pod \"ironic-operator-controller-manager-6bbb499bbc-vg66q\" (UID: \"39c85f77-0a7e-4bcc-8f2c-c7801034b477\") " pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" Mar 12 12:43:19.996116 master-0 kubenswrapper[13984]: I0312 12:43:19.996072 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48"] Mar 12 12:43:19.996290 master-0 kubenswrapper[13984]: I0312 12:43:19.996128 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2"] Mar 12 12:43:19.999509 master-0 kubenswrapper[13984]: I0312 12:43:19.997276 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" Mar 12 12:43:20.003066 master-0 kubenswrapper[13984]: I0312 12:43:20.002384 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2"] Mar 12 12:43:20.012665 master-0 kubenswrapper[13984]: I0312 12:43:20.012104 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt"] Mar 12 12:43:20.013627 master-0 kubenswrapper[13984]: I0312 12:43:20.013281 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" Mar 12 12:43:20.050331 master-0 kubenswrapper[13984]: I0312 12:43:20.036813 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt"] Mar 12 12:43:20.050331 master-0 kubenswrapper[13984]: I0312 12:43:20.050044 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" Mar 12 12:43:20.062022 master-0 kubenswrapper[13984]: I0312 12:43:20.057638 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvnxm\" (UniqueName: \"kubernetes.io/projected/5d2c9b0f-aecf-4033-8364-d746566a5632-kube-api-access-jvnxm\") pod \"neutron-operator-controller-manager-776c5696bf-7gggt\" (UID: \"5d2c9b0f-aecf-4033-8364-d746566a5632\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" Mar 12 12:43:20.062022 master-0 kubenswrapper[13984]: I0312 12:43:20.057785 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6l6b\" (UniqueName: \"kubernetes.io/projected/d78fcdf2-593b-4e94-93f2-5a7091c6c2af-kube-api-access-h6l6b\") pod \"manila-operator-controller-manager-68f45f9d9f-mnp48\" (UID: \"d78fcdf2-593b-4e94-93f2-5a7091c6c2af\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" Mar 12 12:43:20.062022 master-0 kubenswrapper[13984]: I0312 12:43:20.057837 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tj96m\" (UniqueName: \"kubernetes.io/projected/af967d1d-e790-4f8c-85a0-a3758a3b5f77-kube-api-access-tj96m\") pod \"keystone-operator-controller-manager-684f77d66d-th4r4\" (UID: \"af967d1d-e790-4f8c-85a0-a3758a3b5f77\") " 
pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" Mar 12 12:43:20.062022 master-0 kubenswrapper[13984]: I0312 12:43:20.057938 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5gl2\" (UniqueName: \"kubernetes.io/projected/c9f726ff-a744-4adb-a508-e144b957f0c9-kube-api-access-k5gl2\") pod \"mariadb-operator-controller-manager-658d4cdd5-mssg2\" (UID: \"c9f726ff-a744-4adb-a508-e144b957f0c9\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" Mar 12 12:43:20.078062 master-0 kubenswrapper[13984]: I0312 12:43:20.077626 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4"] Mar 12 12:43:20.109831 master-0 kubenswrapper[13984]: I0312 12:43:20.109719 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" Mar 12 12:43:20.121043 master-0 kubenswrapper[13984]: I0312 12:43:20.120976 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj96m\" (UniqueName: \"kubernetes.io/projected/af967d1d-e790-4f8c-85a0-a3758a3b5f77-kube-api-access-tj96m\") pod \"keystone-operator-controller-manager-684f77d66d-th4r4\" (UID: \"af967d1d-e790-4f8c-85a0-a3758a3b5f77\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" Mar 12 12:43:20.125954 master-0 kubenswrapper[13984]: I0312 12:43:20.125765 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4"] Mar 12 12:43:20.177867 master-0 kubenswrapper[13984]: I0312 12:43:20.177381 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" Mar 12 12:43:20.178126 master-0 kubenswrapper[13984]: I0312 12:43:20.177943 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6l6b\" (UniqueName: \"kubernetes.io/projected/d78fcdf2-593b-4e94-93f2-5a7091c6c2af-kube-api-access-h6l6b\") pod \"manila-operator-controller-manager-68f45f9d9f-mnp48\" (UID: \"d78fcdf2-593b-4e94-93f2-5a7091c6c2af\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" Mar 12 12:43:20.179669 master-0 kubenswrapper[13984]: I0312 12:43:20.179401 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5gl2\" (UniqueName: \"kubernetes.io/projected/c9f726ff-a744-4adb-a508-e144b957f0c9-kube-api-access-k5gl2\") pod \"mariadb-operator-controller-manager-658d4cdd5-mssg2\" (UID: \"c9f726ff-a744-4adb-a508-e144b957f0c9\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" Mar 12 12:43:20.180361 master-0 kubenswrapper[13984]: I0312 12:43:20.180292 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvnxm\" (UniqueName: \"kubernetes.io/projected/5d2c9b0f-aecf-4033-8364-d746566a5632-kube-api-access-jvnxm\") pod \"neutron-operator-controller-manager-776c5696bf-7gggt\" (UID: \"5d2c9b0f-aecf-4033-8364-d746566a5632\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" Mar 12 12:43:20.206132 master-0 kubenswrapper[13984]: I0312 12:43:20.199141 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6l6b\" (UniqueName: \"kubernetes.io/projected/d78fcdf2-593b-4e94-93f2-5a7091c6c2af-kube-api-access-h6l6b\") pod \"manila-operator-controller-manager-68f45f9d9f-mnp48\" (UID: \"d78fcdf2-593b-4e94-93f2-5a7091c6c2af\") " pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" Mar 12 
12:43:20.206132 master-0 kubenswrapper[13984]: I0312 12:43:20.199764 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh"] Mar 12 12:43:20.228019 master-0 kubenswrapper[13984]: I0312 12:43:20.227958 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" Mar 12 12:43:20.257670 master-0 kubenswrapper[13984]: I0312 12:43:20.250301 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh"] Mar 12 12:43:20.274119 master-0 kubenswrapper[13984]: I0312 12:43:20.272052 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" Mar 12 12:43:20.274119 master-0 kubenswrapper[13984]: I0312 12:43:20.272682 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s"] Mar 12 12:43:20.283052 master-0 kubenswrapper[13984]: I0312 12:43:20.282390 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmdx\" (UniqueName: \"kubernetes.io/projected/4433ab46-d919-4939-84f7-911505e17f63-kube-api-access-qbmdx\") pod \"nova-operator-controller-manager-569cc54c5-tlxh4\" (UID: \"4433ab46-d919-4939-84f7-911505e17f63\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" Mar 12 12:43:20.283052 master-0 kubenswrapper[13984]: I0312 12:43:20.282497 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:20.284956 master-0 kubenswrapper[13984]: I0312 12:43:20.283462 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgh7c\" (UniqueName: \"kubernetes.io/projected/8f0fd875-5848-456f-945e-bb7cabd5af6e-kube-api-access-wgh7c\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:20.284956 master-0 kubenswrapper[13984]: I0312 12:43:20.283539 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh67t\" (UniqueName: \"kubernetes.io/projected/90572a74-c9ea-448f-a63a-a6db44987f9e-kube-api-access-zh67t\") pod \"octavia-operator-controller-manager-5f4f55cb5c-ggqzh\" (UID: \"90572a74-c9ea-448f-a63a-a6db44987f9e\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" Mar 12 12:43:20.284956 master-0 kubenswrapper[13984]: I0312 12:43:20.283879 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:20.284956 master-0 kubenswrapper[13984]: I0312 12:43:20.284767 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v"] Mar 12 12:43:20.287767 master-0 kubenswrapper[13984]: I0312 12:43:20.287659 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" Mar 12 12:43:20.294505 master-0 kubenswrapper[13984]: I0312 12:43:20.289185 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 12 12:43:20.294505 master-0 kubenswrapper[13984]: I0312 12:43:20.289734 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" Mar 12 12:43:20.294505 master-0 kubenswrapper[13984]: I0312 12:43:20.290523 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5gl2\" (UniqueName: \"kubernetes.io/projected/c9f726ff-a744-4adb-a508-e144b957f0c9-kube-api-access-k5gl2\") pod \"mariadb-operator-controller-manager-658d4cdd5-mssg2\" (UID: \"c9f726ff-a744-4adb-a508-e144b957f0c9\") " pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" Mar 12 12:43:20.302458 master-0 kubenswrapper[13984]: I0312 12:43:20.302254 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvnxm\" (UniqueName: \"kubernetes.io/projected/5d2c9b0f-aecf-4033-8364-d746566a5632-kube-api-access-jvnxm\") pod \"neutron-operator-controller-manager-776c5696bf-7gggt\" (UID: \"5d2c9b0f-aecf-4033-8364-d746566a5632\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" Mar 12 12:43:20.305578 master-0 kubenswrapper[13984]: I0312 12:43:20.303673 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s"] Mar 12 12:43:20.319757 master-0 kubenswrapper[13984]: I0312 12:43:20.319704 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v"] Mar 12 12:43:20.341323 master-0 kubenswrapper[13984]: I0312 12:43:20.341276 13984 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" Mar 12 12:43:20.350809 master-0 kubenswrapper[13984]: I0312 12:43:20.348613 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9"] Mar 12 12:43:20.350809 master-0 kubenswrapper[13984]: I0312 12:43:20.349983 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" Mar 12 12:43:20.370991 master-0 kubenswrapper[13984]: I0312 12:43:20.370944 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg"] Mar 12 12:43:20.372554 master-0 kubenswrapper[13984]: I0312 12:43:20.372533 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" Mar 12 12:43:20.386807 master-0 kubenswrapper[13984]: I0312 12:43:20.386716 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" Mar 12 12:43:20.387589 master-0 kubenswrapper[13984]: I0312 12:43:20.387564 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmdx\" (UniqueName: \"kubernetes.io/projected/4433ab46-d919-4939-84f7-911505e17f63-kube-api-access-qbmdx\") pod \"nova-operator-controller-manager-569cc54c5-tlxh4\" (UID: \"4433ab46-d919-4939-84f7-911505e17f63\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" Mar 12 12:43:20.387717 master-0 kubenswrapper[13984]: I0312 12:43:20.387699 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6x27c\" (UniqueName: \"kubernetes.io/projected/9877d80a-1dec-4316-9b07-31bef3912af9-kube-api-access-6x27c\") pod \"ovn-operator-controller-manager-bbc5b68f9-8pd8v\" (UID: \"9877d80a-1dec-4316-9b07-31bef3912af9\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" Mar 12 12:43:20.387812 master-0 kubenswrapper[13984]: I0312 12:43:20.387742 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg"] Mar 12 12:43:20.387857 master-0 kubenswrapper[13984]: I0312 12:43:20.387783 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: I0312 12:43:20.387938 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2979d\" (UniqueName: 
\"kubernetes.io/projected/4b9d39c4-0231-4d5e-95ca-332b69685f93-kube-api-access-2979d\") pod \"placement-operator-controller-manager-574d45c66c-5bskg\" (UID: \"4b9d39c4-0231-4d5e-95ca-332b69685f93\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: I0312 12:43:20.388120 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwdl5\" (UniqueName: \"kubernetes.io/projected/f103a96f-c9c5-4f60-8c36-78cb1adafb67-kube-api-access-mwdl5\") pod \"swift-operator-controller-manager-677c674df7-m7zm9\" (UID: \"f103a96f-c9c5-4f60-8c36-78cb1adafb67\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: I0312 12:43:20.388191 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgh7c\" (UniqueName: \"kubernetes.io/projected/8f0fd875-5848-456f-945e-bb7cabd5af6e-kube-api-access-wgh7c\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: I0312 12:43:20.388233 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh67t\" (UniqueName: \"kubernetes.io/projected/90572a74-c9ea-448f-a63a-a6db44987f9e-kube-api-access-zh67t\") pod \"octavia-operator-controller-manager-5f4f55cb5c-ggqzh\" (UID: \"90572a74-c9ea-448f-a63a-a6db44987f9e\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: I0312 12:43:20.388324 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod 
\"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: E0312 12:43:20.388446 13984 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: E0312 12:43:20.388509 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert podName:ee56de80-f409-484e-87df-4a4a9b6cf52e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:21.388489322 +0000 UTC m=+1133.586504814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert") pod "infra-operator-controller-manager-b8c8d7cc8-6jzjg" (UID: "ee56de80-f409-484e-87df-4a4a9b6cf52e") : secret "infra-operator-webhook-server-cert" not found Mar 12 12:43:20.404153 master-0 kubenswrapper[13984]: I0312 12:43:20.400793 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9"] Mar 12 12:43:20.404840 master-0 kubenswrapper[13984]: E0312 12:43:20.404804 13984 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 12:43:20.404968 master-0 kubenswrapper[13984]: E0312 12:43:20.404957 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert podName:8f0fd875-5848-456f-945e-bb7cabd5af6e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:20.904930379 +0000 UTC m=+1133.102945871 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" (UID: "8f0fd875-5848-456f-945e-bb7cabd5af6e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 12:43:20.414604 master-0 kubenswrapper[13984]: I0312 12:43:20.414551 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg"] Mar 12 12:43:20.417079 master-0 kubenswrapper[13984]: I0312 12:43:20.416965 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh67t\" (UniqueName: \"kubernetes.io/projected/90572a74-c9ea-448f-a63a-a6db44987f9e-kube-api-access-zh67t\") pod \"octavia-operator-controller-manager-5f4f55cb5c-ggqzh\" (UID: \"90572a74-c9ea-448f-a63a-a6db44987f9e\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" Mar 12 12:43:20.417079 master-0 kubenswrapper[13984]: I0312 12:43:20.417035 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgh7c\" (UniqueName: \"kubernetes.io/projected/8f0fd875-5848-456f-945e-bb7cabd5af6e-kube-api-access-wgh7c\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:20.418690 master-0 kubenswrapper[13984]: I0312 12:43:20.418670 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" Mar 12 12:43:20.421998 master-0 kubenswrapper[13984]: I0312 12:43:20.419412 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmdx\" (UniqueName: \"kubernetes.io/projected/4433ab46-d919-4939-84f7-911505e17f63-kube-api-access-qbmdx\") pod \"nova-operator-controller-manager-569cc54c5-tlxh4\" (UID: \"4433ab46-d919-4939-84f7-911505e17f63\") " pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" Mar 12 12:43:20.430352 master-0 kubenswrapper[13984]: I0312 12:43:20.430297 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg"] Mar 12 12:43:20.443616 master-0 kubenswrapper[13984]: I0312 12:43:20.443542 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" Mar 12 12:43:20.444046 master-0 kubenswrapper[13984]: I0312 12:43:20.444008 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25"] Mar 12 12:43:20.446722 master-0 kubenswrapper[13984]: I0312 12:43:20.446691 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" Mar 12 12:43:20.457811 master-0 kubenswrapper[13984]: I0312 12:43:20.457556 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25"] Mar 12 12:43:20.487501 master-0 kubenswrapper[13984]: I0312 12:43:20.477136 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" Mar 12 12:43:20.499583 master-0 kubenswrapper[13984]: I0312 12:43:20.490964 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wzdz\" (UniqueName: \"kubernetes.io/projected/41dc4bff-40c2-4671-b9d3-2c4c9380d50e-kube-api-access-6wzdz\") pod \"test-operator-controller-manager-5c5cb9c4d7-pbr25\" (UID: \"41dc4bff-40c2-4671-b9d3-2c4c9380d50e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" Mar 12 12:43:20.499583 master-0 kubenswrapper[13984]: I0312 12:43:20.491058 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6x27c\" (UniqueName: \"kubernetes.io/projected/9877d80a-1dec-4316-9b07-31bef3912af9-kube-api-access-6x27c\") pod \"ovn-operator-controller-manager-bbc5b68f9-8pd8v\" (UID: \"9877d80a-1dec-4316-9b07-31bef3912af9\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" Mar 12 12:43:20.499583 master-0 kubenswrapper[13984]: I0312 12:43:20.491392 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2979d\" (UniqueName: \"kubernetes.io/projected/4b9d39c4-0231-4d5e-95ca-332b69685f93-kube-api-access-2979d\") pod \"placement-operator-controller-manager-574d45c66c-5bskg\" (UID: \"4b9d39c4-0231-4d5e-95ca-332b69685f93\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" Mar 12 12:43:20.499583 master-0 kubenswrapper[13984]: I0312 12:43:20.491607 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwdl5\" (UniqueName: \"kubernetes.io/projected/f103a96f-c9c5-4f60-8c36-78cb1adafb67-kube-api-access-mwdl5\") pod \"swift-operator-controller-manager-677c674df7-m7zm9\" (UID: \"f103a96f-c9c5-4f60-8c36-78cb1adafb67\") " 
pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" Mar 12 12:43:20.499583 master-0 kubenswrapper[13984]: I0312 12:43:20.491702 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpsh9\" (UniqueName: \"kubernetes.io/projected/d403dc83-9804-43c8-92a2-065b9029afbd-kube-api-access-zpsh9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-g8bbg\" (UID: \"d403dc83-9804-43c8-92a2-065b9029afbd\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" Mar 12 12:43:20.506505 master-0 kubenswrapper[13984]: I0312 12:43:20.506466 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps"] Mar 12 12:43:20.507800 master-0 kubenswrapper[13984]: I0312 12:43:20.507771 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" Mar 12 12:43:20.509035 master-0 kubenswrapper[13984]: I0312 12:43:20.508983 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwdl5\" (UniqueName: \"kubernetes.io/projected/f103a96f-c9c5-4f60-8c36-78cb1adafb67-kube-api-access-mwdl5\") pod \"swift-operator-controller-manager-677c674df7-m7zm9\" (UID: \"f103a96f-c9c5-4f60-8c36-78cb1adafb67\") " pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" Mar 12 12:43:20.511342 master-0 kubenswrapper[13984]: I0312 12:43:20.511304 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2979d\" (UniqueName: \"kubernetes.io/projected/4b9d39c4-0231-4d5e-95ca-332b69685f93-kube-api-access-2979d\") pod \"placement-operator-controller-manager-574d45c66c-5bskg\" (UID: \"4b9d39c4-0231-4d5e-95ca-332b69685f93\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" Mar 12 12:43:20.514670 master-0 
kubenswrapper[13984]: I0312 12:43:20.514647 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6x27c\" (UniqueName: \"kubernetes.io/projected/9877d80a-1dec-4316-9b07-31bef3912af9-kube-api-access-6x27c\") pod \"ovn-operator-controller-manager-bbc5b68f9-8pd8v\" (UID: \"9877d80a-1dec-4316-9b07-31bef3912af9\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" Mar 12 12:43:20.517078 master-0 kubenswrapper[13984]: I0312 12:43:20.517046 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps"] Mar 12 12:43:20.579703 master-0 kubenswrapper[13984]: I0312 12:43:20.548135 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" Mar 12 12:43:20.579703 master-0 kubenswrapper[13984]: I0312 12:43:20.562380 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"] Mar 12 12:43:20.579703 master-0 kubenswrapper[13984]: I0312 12:43:20.565689 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.579703 master-0 kubenswrapper[13984]: I0312 12:43:20.578821 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 12 12:43:20.579703 master-0 kubenswrapper[13984]: I0312 12:43:20.579049 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.593228 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpsh9\" (UniqueName: \"kubernetes.io/projected/d403dc83-9804-43c8-92a2-065b9029afbd-kube-api-access-zpsh9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-g8bbg\" (UID: \"d403dc83-9804-43c8-92a2-065b9029afbd\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.593315 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wzdz\" (UniqueName: \"kubernetes.io/projected/41dc4bff-40c2-4671-b9d3-2c4c9380d50e-kube-api-access-6wzdz\") pod \"test-operator-controller-manager-5c5cb9c4d7-pbr25\" (UID: \"41dc4bff-40c2-4671-b9d3-2c4c9380d50e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.593348 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.593375 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.593433 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgjvc\" (UniqueName: \"kubernetes.io/projected/e39a6c9f-0c60-41c8-b20f-e619f7d69af5-kube-api-access-zgjvc\") pod \"watcher-operator-controller-manager-6dd88c6f67-cgmps\" (UID: \"e39a6c9f-0c60-41c8-b20f-e619f7d69af5\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.593489 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6rlx\" (UniqueName: \"kubernetes.io/projected/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-kube-api-access-q6rlx\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.600331 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"] Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.612742 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5"] Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.614030 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.627375 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5"] Mar 12 12:43:20.635208 master-0 kubenswrapper[13984]: I0312 12:43:20.635081 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wzdz\" (UniqueName: \"kubernetes.io/projected/41dc4bff-40c2-4671-b9d3-2c4c9380d50e-kube-api-access-6wzdz\") pod \"test-operator-controller-manager-5c5cb9c4d7-pbr25\" (UID: \"41dc4bff-40c2-4671-b9d3-2c4c9380d50e\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" Mar 12 12:43:20.636265 master-0 kubenswrapper[13984]: I0312 12:43:20.636235 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpsh9\" (UniqueName: \"kubernetes.io/projected/d403dc83-9804-43c8-92a2-065b9029afbd-kube-api-access-zpsh9\") pod \"telemetry-operator-controller-manager-6cd66dbd4b-g8bbg\" (UID: \"d403dc83-9804-43c8-92a2-065b9029afbd\") " pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" Mar 12 12:43:20.701126 master-0 kubenswrapper[13984]: I0312 12:43:20.700687 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.701126 master-0 kubenswrapper[13984]: I0312 12:43:20.700781 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgjvc\" (UniqueName: \"kubernetes.io/projected/e39a6c9f-0c60-41c8-b20f-e619f7d69af5-kube-api-access-zgjvc\") pod 
\"watcher-operator-controller-manager-6dd88c6f67-cgmps\" (UID: \"e39a6c9f-0c60-41c8-b20f-e619f7d69af5\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" Mar 12 12:43:20.702047 master-0 kubenswrapper[13984]: I0312 12:43:20.701992 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6rlx\" (UniqueName: \"kubernetes.io/projected/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-kube-api-access-q6rlx\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.716638 master-0 kubenswrapper[13984]: E0312 12:43:20.705450 13984 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 12:43:20.716638 master-0 kubenswrapper[13984]: E0312 12:43:20.707865 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:21.207842744 +0000 UTC m=+1133.405858236 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "webhook-server-cert" not found Mar 12 12:43:20.716638 master-0 kubenswrapper[13984]: I0312 12:43:20.707170 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.716638 master-0 kubenswrapper[13984]: E0312 12:43:20.707259 13984 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 12:43:20.716638 master-0 kubenswrapper[13984]: E0312 12:43:20.711135 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:21.211113573 +0000 UTC m=+1133.409129065 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "metrics-server-cert" not found Mar 12 12:43:20.754931 master-0 kubenswrapper[13984]: I0312 12:43:20.743732 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgjvc\" (UniqueName: \"kubernetes.io/projected/e39a6c9f-0c60-41c8-b20f-e619f7d69af5-kube-api-access-zgjvc\") pod \"watcher-operator-controller-manager-6dd88c6f67-cgmps\" (UID: \"e39a6c9f-0c60-41c8-b20f-e619f7d69af5\") " pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" Mar 12 12:43:20.780408 master-0 kubenswrapper[13984]: I0312 12:43:20.780357 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6rlx\" (UniqueName: \"kubernetes.io/projected/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-kube-api-access-q6rlx\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:20.812955 master-0 kubenswrapper[13984]: I0312 12:43:20.812911 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn4pd\" (UniqueName: \"kubernetes.io/projected/5446d730-d739-4221-98aa-3e35e455cdbb-kube-api-access-sn4pd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6n5w5\" (UID: \"5446d730-d739-4221-98aa-3e35e455cdbb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" Mar 12 12:43:20.884698 master-0 kubenswrapper[13984]: I0312 12:43:20.884512 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" Mar 12 12:43:20.927754 master-0 kubenswrapper[13984]: I0312 12:43:20.927594 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:20.927754 master-0 kubenswrapper[13984]: I0312 12:43:20.927699 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn4pd\" (UniqueName: \"kubernetes.io/projected/5446d730-d739-4221-98aa-3e35e455cdbb-kube-api-access-sn4pd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6n5w5\" (UID: \"5446d730-d739-4221-98aa-3e35e455cdbb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" Mar 12 12:43:20.927987 master-0 kubenswrapper[13984]: E0312 12:43:20.927843 13984 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 12:43:20.927987 master-0 kubenswrapper[13984]: E0312 12:43:20.927937 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert podName:8f0fd875-5848-456f-945e-bb7cabd5af6e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:21.927919136 +0000 UTC m=+1134.125934628 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" (UID: "8f0fd875-5848-456f-945e-bb7cabd5af6e") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 12 12:43:20.934767 master-0 kubenswrapper[13984]: I0312 12:43:20.934626 13984 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 12:43:20.943985 master-0 kubenswrapper[13984]: I0312 12:43:20.943949 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" Mar 12 12:43:20.944586 master-0 kubenswrapper[13984]: I0312 12:43:20.944459 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl"] Mar 12 12:43:20.952746 master-0 kubenswrapper[13984]: I0312 12:43:20.947068 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn4pd\" (UniqueName: \"kubernetes.io/projected/5446d730-d739-4221-98aa-3e35e455cdbb-kube-api-access-sn4pd\") pod \"rabbitmq-cluster-operator-manager-668c99d594-6n5w5\" (UID: \"5446d730-d739-4221-98aa-3e35e455cdbb\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" Mar 12 12:43:20.962009 master-0 kubenswrapper[13984]: I0312 12:43:20.959062 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" Mar 12 12:43:20.971090 master-0 kubenswrapper[13984]: I0312 12:43:20.970852 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" Mar 12 12:43:20.987015 master-0 kubenswrapper[13984]: I0312 12:43:20.986974 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" Mar 12 12:43:21.011190 master-0 kubenswrapper[13984]: I0312 12:43:21.011147 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" Mar 12 12:43:21.251781 master-0 kubenswrapper[13984]: I0312 12:43:21.239425 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:21.251781 master-0 kubenswrapper[13984]: I0312 12:43:21.239536 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:21.251781 master-0 kubenswrapper[13984]: E0312 12:43:21.239635 13984 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 12:43:21.251781 master-0 kubenswrapper[13984]: E0312 12:43:21.239700 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:22.239679151 +0000 UTC m=+1134.437694643 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "metrics-server-cert" not found Mar 12 12:43:21.251781 master-0 kubenswrapper[13984]: E0312 12:43:21.239880 13984 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 12:43:21.251781 master-0 kubenswrapper[13984]: E0312 12:43:21.239991 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:22.239965889 +0000 UTC m=+1134.437981481 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "webhook-server-cert" not found Mar 12 12:43:21.284221 master-0 kubenswrapper[13984]: I0312 12:43:21.284150 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl" event={"ID":"e7256eef-5c42-4470-98da-e9ccc9d0fa7c","Type":"ContainerStarted","Data":"b03d002733d86dc086053dde2b5ead23deb4e11ec88ffc1ce5a4cd97c1fe9513"} Mar 12 12:43:21.379922 master-0 kubenswrapper[13984]: I0312 12:43:21.378818 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x"] Mar 12 12:43:21.388638 master-0 kubenswrapper[13984]: W0312 12:43:21.388593 13984 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe570816_5558_444f_8bd8_6b5a75a80f77.slice/crio-82953feb7ed3c0e7ea2a46f1b22539042d19303b264ac211395d08cf918fd43a WatchSource:0}: Error finding container 82953feb7ed3c0e7ea2a46f1b22539042d19303b264ac211395d08cf918fd43a: Status 404 returned error can't find the container with id 82953feb7ed3c0e7ea2a46f1b22539042d19303b264ac211395d08cf918fd43a
Mar 12 12:43:21.444128 master-0 kubenswrapper[13984]: I0312 12:43:21.444044 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg"
Mar 12 12:43:21.444391 master-0 kubenswrapper[13984]: E0312 12:43:21.444207 13984 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 12 12:43:21.444391 master-0 kubenswrapper[13984]: E0312 12:43:21.444291 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert podName:ee56de80-f409-484e-87df-4a4a9b6cf52e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:23.444270243 +0000 UTC m=+1135.642285735 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert") pod "infra-operator-controller-manager-b8c8d7cc8-6jzjg" (UID: "ee56de80-f409-484e-87df-4a4a9b6cf52e") : secret "infra-operator-webhook-server-cert" not found
Mar 12 12:43:21.589772 master-0 kubenswrapper[13984]: W0312 12:43:21.589700 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e479d11_fd58_4636_9410_297bb6d4f88f.slice/crio-3e99951cd439d3881f4049315a6c23266b1e2a90a47c640615d03ad9809bc13d WatchSource:0}: Error finding container 3e99951cd439d3881f4049315a6c23266b1e2a90a47c640615d03ad9809bc13d: Status 404 returned error can't find the container with id 3e99951cd439d3881f4049315a6c23266b1e2a90a47c640615d03ad9809bc13d
Mar 12 12:43:21.590181 master-0 kubenswrapper[13984]: W0312 12:43:21.590106 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod225a2621_3a81_48d7_bf57_8cb7355e9acf.slice/crio-2d37286c23a240a49b42e587bfe339f076bb348e86600bce60d969933f29c0e7 WatchSource:0}: Error finding container 2d37286c23a240a49b42e587bfe339f076bb348e86600bce60d969933f29c0e7: Status 404 returned error can't find the container with id 2d37286c23a240a49b42e587bfe339f076bb348e86600bce60d969933f29c0e7
Mar 12 12:43:21.592801 master-0 kubenswrapper[13984]: I0312 12:43:21.592765 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl"]
Mar 12 12:43:21.603824 master-0 kubenswrapper[13984]: W0312 12:43:21.603770 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6d55477_329a_4fe6_9af5_a1146ee8844e.slice/crio-daedb4d4da9a3bec15a9a7aa03d56ccd906bbc3735ed578acfc58d439921064a WatchSource:0}: Error finding container daedb4d4da9a3bec15a9a7aa03d56ccd906bbc3735ed578acfc58d439921064a: Status 404 returned error can't find the container with id daedb4d4da9a3bec15a9a7aa03d56ccd906bbc3735ed578acfc58d439921064a
Mar 12 12:43:21.608659 master-0 kubenswrapper[13984]: I0312 12:43:21.608610 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd"]
Mar 12 12:43:21.611502 master-0 kubenswrapper[13984]: I0312 12:43:21.610877 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965"]
Mar 12 12:43:21.814754 master-0 kubenswrapper[13984]: W0312 12:43:21.814702 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9f726ff_a744_4adb_a508_e144b957f0c9.slice/crio-4436e3842d87d7b4a41a9af2ab1dbd359fc286037a9c86c193dc8323f966c772 WatchSource:0}: Error finding container 4436e3842d87d7b4a41a9af2ab1dbd359fc286037a9c86c193dc8323f966c772: Status 404 returned error can't find the container with id 4436e3842d87d7b4a41a9af2ab1dbd359fc286037a9c86c193dc8323f966c772
Mar 12 12:43:21.821155 master-0 kubenswrapper[13984]: I0312 12:43:21.821064 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2"]
Mar 12 12:43:21.845973 master-0 kubenswrapper[13984]: I0312 12:43:21.845905 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48"]
Mar 12 12:43:21.845973 master-0 kubenswrapper[13984]: I0312 12:43:21.845980 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q"]
Mar 12 12:43:21.871106 master-0 kubenswrapper[13984]: I0312 12:43:21.871051 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg"]
Mar 12 12:43:21.924594 master-0 kubenswrapper[13984]: W0312 12:43:21.923765 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6fe9677_4309_41ac_b5dc_a6e7c95af7e8.slice/crio-c7932755a604520264ba52de04b505d4a8fd5a59b0dc5056167b2fc2c07f1b3e WatchSource:0}: Error finding container c7932755a604520264ba52de04b505d4a8fd5a59b0dc5056167b2fc2c07f1b3e: Status 404 returned error can't find the container with id c7932755a604520264ba52de04b505d4a8fd5a59b0dc5056167b2fc2c07f1b3e
Mar 12 12:43:21.986585 master-0 kubenswrapper[13984]: I0312 12:43:21.982588 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s"
Mar 12 12:43:21.986585 master-0 kubenswrapper[13984]: E0312 12:43:21.982782 13984 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 12:43:21.986585 master-0 kubenswrapper[13984]: E0312 12:43:21.982840 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert podName:8f0fd875-5848-456f-945e-bb7cabd5af6e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:23.982825794 +0000 UTC m=+1136.180841286 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" (UID: "8f0fd875-5848-456f-945e-bb7cabd5af6e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 12:43:22.288156 master-0 kubenswrapper[13984]: I0312 12:43:22.287999 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"
Mar 12 12:43:22.288156 master-0 kubenswrapper[13984]: I0312 12:43:22.288126 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"
Mar 12 12:43:22.288410 master-0 kubenswrapper[13984]: E0312 12:43:22.288199 13984 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 12 12:43:22.288410 master-0 kubenswrapper[13984]: E0312 12:43:22.288256 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:24.288239417 +0000 UTC m=+1136.486254909 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "metrics-server-cert" not found
Mar 12 12:43:22.288410 master-0 kubenswrapper[13984]: E0312 12:43:22.288375 13984 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 12 12:43:22.288410 master-0 kubenswrapper[13984]: E0312 12:43:22.288401 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:24.288394001 +0000 UTC m=+1136.486409483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "webhook-server-cert" not found
Mar 12 12:43:22.299709 master-0 kubenswrapper[13984]: I0312 12:43:22.299644 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" event={"ID":"8e479d11-fd58-4636-9410-297bb6d4f88f","Type":"ContainerStarted","Data":"3e99951cd439d3881f4049315a6c23266b1e2a90a47c640615d03ad9809bc13d"}
Mar 12 12:43:22.304270 master-0 kubenswrapper[13984]: I0312 12:43:22.304228 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x" event={"ID":"fe570816-5558-444f-8bd8-6b5a75a80f77","Type":"ContainerStarted","Data":"82953feb7ed3c0e7ea2a46f1b22539042d19303b264ac211395d08cf918fd43a"}
Mar 12 12:43:22.305603 master-0 kubenswrapper[13984]: I0312 12:43:22.305549 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" event={"ID":"b6d55477-329a-4fe6-9af5-a1146ee8844e","Type":"ContainerStarted","Data":"daedb4d4da9a3bec15a9a7aa03d56ccd906bbc3735ed578acfc58d439921064a"}
Mar 12 12:43:22.309890 master-0 kubenswrapper[13984]: I0312 12:43:22.309862 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" event={"ID":"c9f726ff-a744-4adb-a508-e144b957f0c9","Type":"ContainerStarted","Data":"4436e3842d87d7b4a41a9af2ab1dbd359fc286037a9c86c193dc8323f966c772"}
Mar 12 12:43:22.311503 master-0 kubenswrapper[13984]: I0312 12:43:22.311469 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" event={"ID":"225a2621-3a81-48d7-bf57-8cb7355e9acf","Type":"ContainerStarted","Data":"2d37286c23a240a49b42e587bfe339f076bb348e86600bce60d969933f29c0e7"}
Mar 12 12:43:22.312871 master-0 kubenswrapper[13984]: I0312 12:43:22.312843 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" event={"ID":"d78fcdf2-593b-4e94-93f2-5a7091c6c2af","Type":"ContainerStarted","Data":"3566ff2c578ac7e5d85df2f7266f6e0d9133d952d2ff363b203724ca11f6f528"}
Mar 12 12:43:22.314543 master-0 kubenswrapper[13984]: I0312 12:43:22.314522 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" event={"ID":"f6fe9677-4309-41ac-b5dc-a6e7c95af7e8","Type":"ContainerStarted","Data":"c7932755a604520264ba52de04b505d4a8fd5a59b0dc5056167b2fc2c07f1b3e"}
Mar 12 12:43:22.318379 master-0 kubenswrapper[13984]: I0312 12:43:22.318247 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" event={"ID":"39c85f77-0a7e-4bcc-8f2c-c7801034b477","Type":"ContainerStarted","Data":"e6ab601efdbf0dcb872efd05f007ace93e1673e70cd38f563fbc530638c16d37"}
Mar 12 12:43:22.683584 master-0 kubenswrapper[13984]: I0312 12:43:22.682008 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5"]
Mar 12 12:43:22.707046 master-0 kubenswrapper[13984]: W0312 12:43:22.706143 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5446d730_d739_4221_98aa_3e35e455cdbb.slice/crio-ed95803fc27933d414e4d6496f205207e60a126ce871d391cdebe4f0710685ea WatchSource:0}: Error finding container ed95803fc27933d414e4d6496f205207e60a126ce871d391cdebe4f0710685ea: Status 404 returned error can't find the container with id ed95803fc27933d414e4d6496f205207e60a126ce871d391cdebe4f0710685ea
Mar 12 12:43:22.707046 master-0 kubenswrapper[13984]: I0312 12:43:22.706191 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4"]
Mar 12 12:43:22.724757 master-0 kubenswrapper[13984]: I0312 12:43:22.720884 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25"]
Mar 12 12:43:22.743511 master-0 kubenswrapper[13984]: I0312 12:43:22.743433 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh"]
Mar 12 12:43:22.759397 master-0 kubenswrapper[13984]: I0312 12:43:22.757979 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg"]
Mar 12 12:43:22.772079 master-0 kubenswrapper[13984]: E0312 12:43:22.770900 13984 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tj96m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-th4r4_openstack-operators(af967d1d-e790-4f8c-85a0-a3758a3b5f77): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 12 12:43:22.772367 master-0 kubenswrapper[13984]: E0312 12:43:22.772208 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" podUID="af967d1d-e790-4f8c-85a0-a3758a3b5f77"
Mar 12 12:43:22.779669 master-0 kubenswrapper[13984]: E0312 12:43:22.779593 13984 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6x27c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-8pd8v_openstack-operators(9877d80a-1dec-4316-9b07-31bef3912af9): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 12 12:43:22.781287 master-0 kubenswrapper[13984]: E0312 12:43:22.780816 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" podUID="9877d80a-1dec-4316-9b07-31bef3912af9"
Mar 12 12:43:22.851093 master-0 kubenswrapper[13984]: I0312 12:43:22.851028 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg"]
Mar 12 12:43:22.869691 master-0 kubenswrapper[13984]: I0312 12:43:22.869634 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt"]
Mar 12 12:43:22.887764 master-0 kubenswrapper[13984]: I0312 12:43:22.887705 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4"]
Mar 12 12:43:22.900641 master-0 kubenswrapper[13984]: I0312 12:43:22.900554 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9"]
Mar 12 12:43:22.909827 master-0 kubenswrapper[13984]: I0312 12:43:22.909756 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v"]
Mar 12 12:43:22.935956 master-0 kubenswrapper[13984]: I0312 12:43:22.935793 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps"]
Mar 12 12:43:23.333629 master-0 kubenswrapper[13984]: I0312 12:43:23.333573 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" event={"ID":"4b9d39c4-0231-4d5e-95ca-332b69685f93","Type":"ContainerStarted","Data":"1ce4876a0e2fdcce27dca74318ed43b38ff19962926bb3f20ce2fac255a67c94"}
Mar 12 12:43:23.336534 master-0 kubenswrapper[13984]: I0312 12:43:23.336509 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" event={"ID":"4433ab46-d919-4939-84f7-911505e17f63","Type":"ContainerStarted","Data":"629338ee250adbda454196567617be8e2cb26e1c87b98f261078cd9ebda99ecf"}
Mar 12 12:43:23.338976 master-0 kubenswrapper[13984]: I0312 12:43:23.338953 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" event={"ID":"90572a74-c9ea-448f-a63a-a6db44987f9e","Type":"ContainerStarted","Data":"95e7e63a4c0eccc6224797e0ddd89bdd8fd6a83ecea9075beb3d6324ee7dd040"}
Mar 12 12:43:23.340169 master-0 kubenswrapper[13984]: I0312 12:43:23.340148 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" event={"ID":"af967d1d-e790-4f8c-85a0-a3758a3b5f77","Type":"ContainerStarted","Data":"ae08d4f3160da5635aba3943221435aee0ae22993fb55eaf19b9865043a583b8"}
Mar 12 12:43:23.341508 master-0 kubenswrapper[13984]: E0312 12:43:23.341464 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" podUID="af967d1d-e790-4f8c-85a0-a3758a3b5f77"
Mar 12 12:43:23.342744 master-0 kubenswrapper[13984]: I0312 12:43:23.342712 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" event={"ID":"f103a96f-c9c5-4f60-8c36-78cb1adafb67","Type":"ContainerStarted","Data":"9e068eb80616500da566d43c3dfbbac1bd14b95660fd2bc3fd77491bbe621252"}
Mar 12 12:43:23.344147 master-0 kubenswrapper[13984]: I0312 12:43:23.344103 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" event={"ID":"5446d730-d739-4221-98aa-3e35e455cdbb","Type":"ContainerStarted","Data":"ed95803fc27933d414e4d6496f205207e60a126ce871d391cdebe4f0710685ea"}
Mar 12 12:43:23.345146 master-0 kubenswrapper[13984]: I0312 12:43:23.345089 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" event={"ID":"d403dc83-9804-43c8-92a2-065b9029afbd","Type":"ContainerStarted","Data":"c8300e4f5ec02372e3f8112538b2214ee9781772b179febdd725309b340dbee4"}
Mar 12 12:43:23.346105 master-0 kubenswrapper[13984]: I0312 12:43:23.346078 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" event={"ID":"41dc4bff-40c2-4671-b9d3-2c4c9380d50e","Type":"ContainerStarted","Data":"4929e601b5fbf4d9fa9090d7f59ab5fd2fa6fd06f0b018f49aa4e2535a5faa6d"}
Mar 12 12:43:23.346992 master-0 kubenswrapper[13984]: I0312 12:43:23.346963 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" event={"ID":"5d2c9b0f-aecf-4033-8364-d746566a5632","Type":"ContainerStarted","Data":"799ae7bd2a08c23df5dd38b4425e2519eaf573f3a6a3f79ec930fae3939e7e72"}
Mar 12 12:43:23.349097 master-0 kubenswrapper[13984]: I0312 12:43:23.349058 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" event={"ID":"9877d80a-1dec-4316-9b07-31bef3912af9","Type":"ContainerStarted","Data":"b627e9b730f44721a8cb4a2c93075d22e1cd3de9a13e86ab33e495a9420e8e93"}
Mar 12 12:43:23.350814 master-0 kubenswrapper[13984]: E0312 12:43:23.350770 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" podUID="9877d80a-1dec-4316-9b07-31bef3912af9"
Mar 12 12:43:23.351550 master-0 kubenswrapper[13984]: I0312 12:43:23.351527 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" event={"ID":"e39a6c9f-0c60-41c8-b20f-e619f7d69af5","Type":"ContainerStarted","Data":"67e20681514a78b44100049aedb6c8f89d19915d20c7faba73c97ef5be14c45a"}
Mar 12 12:43:23.515880 master-0 kubenswrapper[13984]: I0312 12:43:23.515646 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg"
Mar 12 12:43:23.515880 master-0 kubenswrapper[13984]: E0312 12:43:23.515806 13984 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 12 12:43:23.516268 master-0 kubenswrapper[13984]: E0312 12:43:23.515967 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert podName:ee56de80-f409-484e-87df-4a4a9b6cf52e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:27.515937373 +0000 UTC m=+1139.713952895 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert") pod "infra-operator-controller-manager-b8c8d7cc8-6jzjg" (UID: "ee56de80-f409-484e-87df-4a4a9b6cf52e") : secret "infra-operator-webhook-server-cert" not found
Mar 12 12:43:24.048065 master-0 kubenswrapper[13984]: I0312 12:43:24.048003 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s"
Mar 12 12:43:24.049210 master-0 kubenswrapper[13984]: E0312 12:43:24.049057 13984 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 12:43:24.049210 master-0 kubenswrapper[13984]: E0312 12:43:24.049122 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert podName:8f0fd875-5848-456f-945e-bb7cabd5af6e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:28.049106798 +0000 UTC m=+1140.247122290 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" (UID: "8f0fd875-5848-456f-945e-bb7cabd5af6e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 12:43:24.353076 master-0 kubenswrapper[13984]: I0312 12:43:24.352256 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"
Mar 12 12:43:24.353076 master-0 kubenswrapper[13984]: E0312 12:43:24.352388 13984 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 12 12:43:24.353076 master-0 kubenswrapper[13984]: E0312 12:43:24.352467 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:28.352448044 +0000 UTC m=+1140.550463536 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "webhook-server-cert" not found
Mar 12 12:43:24.353076 master-0 kubenswrapper[13984]: I0312 12:43:24.352537 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"
Mar 12 12:43:24.353076 master-0 kubenswrapper[13984]: E0312 12:43:24.352634 13984 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 12 12:43:24.353076 master-0 kubenswrapper[13984]: E0312 12:43:24.352675 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:28.35266561 +0000 UTC m=+1140.550681102 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "metrics-server-cert" not found
Mar 12 12:43:24.365278 master-0 kubenswrapper[13984]: E0312 12:43:24.364780 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" podUID="af967d1d-e790-4f8c-85a0-a3758a3b5f77"
Mar 12 12:43:24.365278 master-0 kubenswrapper[13984]: E0312 12:43:24.364830 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" podUID="9877d80a-1dec-4316-9b07-31bef3912af9"
Mar 12 12:43:27.522680 master-0 kubenswrapper[13984]: I0312 12:43:27.522607 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg"
Mar 12 12:43:27.523814 master-0 kubenswrapper[13984]: E0312 12:43:27.523370 13984 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 12 12:43:27.523814 master-0 kubenswrapper[13984]: E0312 12:43:27.523452 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert podName:ee56de80-f409-484e-87df-4a4a9b6cf52e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:35.523431709 +0000 UTC m=+1147.721447271 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert") pod "infra-operator-controller-manager-b8c8d7cc8-6jzjg" (UID: "ee56de80-f409-484e-87df-4a4a9b6cf52e") : secret "infra-operator-webhook-server-cert" not found
Mar 12 12:43:28.135423 master-0 kubenswrapper[13984]: I0312 12:43:28.135364 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s"
Mar 12 12:43:28.135768 master-0 kubenswrapper[13984]: E0312 12:43:28.135524 13984 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 12:43:28.135768 master-0 kubenswrapper[13984]: E0312 12:43:28.135580 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert podName:8f0fd875-5848-456f-945e-bb7cabd5af6e nodeName:}" failed. No retries permitted until 2026-03-12 12:43:36.135565629 +0000 UTC m=+1148.333581121 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert") pod "openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" (UID: "8f0fd875-5848-456f-945e-bb7cabd5af6e") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 12 12:43:28.441171 master-0 kubenswrapper[13984]: I0312 12:43:28.440720 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"
Mar 12 12:43:28.441171 master-0 kubenswrapper[13984]: I0312 12:43:28.440925 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"
Mar 12 12:43:28.441171 master-0 kubenswrapper[13984]: E0312 12:43:28.440934 13984 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 12 12:43:28.441171 master-0 kubenswrapper[13984]: E0312 12:43:28.441020 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:36.440999763 +0000 UTC m=+1148.639015325 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "metrics-server-cert" not found
Mar 12 12:43:28.441171 master-0 kubenswrapper[13984]: E0312 12:43:28.441047 13984 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 12 12:43:28.441171 master-0 kubenswrapper[13984]: E0312 12:43:28.441105 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:36.441090545 +0000 UTC m=+1148.639106037 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "webhook-server-cert" not found
Mar 12 12:43:35.586086 master-0 kubenswrapper[13984]: I0312 12:43:35.586013 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") " pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg"
Mar 12 12:43:35.592115 master-0 kubenswrapper[13984]: I0312 12:43:35.592057 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee56de80-f409-484e-87df-4a4a9b6cf52e-cert\") pod \"infra-operator-controller-manager-b8c8d7cc8-6jzjg\" (UID: \"ee56de80-f409-484e-87df-4a4a9b6cf52e\") "
pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:35.766535 master-0 kubenswrapper[13984]: I0312 12:43:35.766447 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:36.196828 master-0 kubenswrapper[13984]: I0312 12:43:36.196704 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:36.200740 master-0 kubenswrapper[13984]: I0312 12:43:36.200704 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f0fd875-5848-456f-945e-bb7cabd5af6e-cert\") pod \"openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s\" (UID: \"8f0fd875-5848-456f-945e-bb7cabd5af6e\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:36.432090 master-0 kubenswrapper[13984]: I0312 12:43:36.431977 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:36.501707 master-0 kubenswrapper[13984]: I0312 12:43:36.501642 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:36.501707 master-0 kubenswrapper[13984]: I0312 12:43:36.501707 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:36.501995 master-0 kubenswrapper[13984]: E0312 12:43:36.501825 13984 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 12 12:43:36.501995 master-0 kubenswrapper[13984]: E0312 12:43:36.501877 13984 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 12 12:43:36.501995 master-0 kubenswrapper[13984]: E0312 12:43:36.501898 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:52.501877181 +0000 UTC m=+1164.699892743 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "metrics-server-cert" not found Mar 12 12:43:36.501995 master-0 kubenswrapper[13984]: E0312 12:43:36.501927 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs podName:f7682f7d-757d-4d0d-90d6-9e7c93a6f069 nodeName:}" failed. No retries permitted until 2026-03-12 12:43:52.501910922 +0000 UTC m=+1164.699926414 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs") pod "openstack-operator-controller-manager-7795b46f77-cxdr7" (UID: "f7682f7d-757d-4d0d-90d6-9e7c93a6f069") : secret "webhook-server-cert" not found Mar 12 12:43:40.956570 master-0 kubenswrapper[13984]: I0312 12:43:40.956512 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg"] Mar 12 12:43:40.987237 master-0 kubenswrapper[13984]: W0312 12:43:40.985944 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee56de80_f409_484e_87df_4a4a9b6cf52e.slice/crio-39c9e9da490883c70b754002feed68d2a722ca314ccf3c51f62cdd5caf51074c WatchSource:0}: Error finding container 39c9e9da490883c70b754002feed68d2a722ca314ccf3c51f62cdd5caf51074c: Status 404 returned error can't find the container with id 39c9e9da490883c70b754002feed68d2a722ca314ccf3c51f62cdd5caf51074c Mar 12 12:43:41.133124 master-0 kubenswrapper[13984]: I0312 12:43:41.132072 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s"] Mar 12 12:43:41.267113 master-0 
kubenswrapper[13984]: W0312 12:43:41.264872 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f0fd875_5848_456f_945e_bb7cabd5af6e.slice/crio-59475c59ceace6b5aa2c142cfd36b9f808471cb47ac6b07bb0e35e5efb86bf26 WatchSource:0}: Error finding container 59475c59ceace6b5aa2c142cfd36b9f808471cb47ac6b07bb0e35e5efb86bf26: Status 404 returned error can't find the container with id 59475c59ceace6b5aa2c142cfd36b9f808471cb47ac6b07bb0e35e5efb86bf26 Mar 12 12:43:41.602066 master-0 kubenswrapper[13984]: I0312 12:43:41.601602 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" event={"ID":"d403dc83-9804-43c8-92a2-065b9029afbd","Type":"ContainerStarted","Data":"e531092f2c8d6e8ae0e97cdf6eb4a3910eeb75ab5e5ec3e4ad3309a01c5fe930"} Mar 12 12:43:41.602066 master-0 kubenswrapper[13984]: I0312 12:43:41.601700 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" Mar 12 12:43:41.633067 master-0 kubenswrapper[13984]: I0312 12:43:41.631250 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" event={"ID":"c9f726ff-a744-4adb-a508-e144b957f0c9","Type":"ContainerStarted","Data":"8d0fde9aa15da023f01d4ab9a2b626f02ac5da95827c657a560a1f56cac88381"} Mar 12 12:43:41.633067 master-0 kubenswrapper[13984]: I0312 12:43:41.632130 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" Mar 12 12:43:41.658191 master-0 kubenswrapper[13984]: I0312 12:43:41.655669 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" 
event={"ID":"b6d55477-329a-4fe6-9af5-a1146ee8844e","Type":"ContainerStarted","Data":"a20b949affb6db8842a2adbd1d59f987fed3f62ed4f6875d89e15931459c1508"} Mar 12 12:43:41.658191 master-0 kubenswrapper[13984]: I0312 12:43:41.655847 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" Mar 12 12:43:41.672619 master-0 kubenswrapper[13984]: I0312 12:43:41.671720 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" event={"ID":"225a2621-3a81-48d7-bf57-8cb7355e9acf","Type":"ContainerStarted","Data":"6359aa7a36687dfd19bfca6a21a0dc30ba9ee4c18abcad0c1335e326e1593985"} Mar 12 12:43:41.672843 master-0 kubenswrapper[13984]: I0312 12:43:41.672639 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" Mar 12 12:43:41.682550 master-0 kubenswrapper[13984]: I0312 12:43:41.676850 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl" event={"ID":"e7256eef-5c42-4470-98da-e9ccc9d0fa7c","Type":"ContainerStarted","Data":"7e35cfb07ed57e0b3a3cac35d150aca590be2c01c220f4ef113d9fa45513d9d6"} Mar 12 12:43:41.682550 master-0 kubenswrapper[13984]: I0312 12:43:41.677468 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl" Mar 12 12:43:41.688379 master-0 kubenswrapper[13984]: I0312 12:43:41.688201 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" podStartSLOduration=6.633847729 podStartE2EDuration="22.688176882s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.770555629 +0000 UTC m=+1134.968571121" 
lastFinishedPulling="2026-03-12 12:43:38.824884782 +0000 UTC m=+1151.022900274" observedRunningTime="2026-03-12 12:43:41.660922321 +0000 UTC m=+1153.858937823" watchObservedRunningTime="2026-03-12 12:43:41.688176882 +0000 UTC m=+1153.886192374" Mar 12 12:43:41.723497 master-0 kubenswrapper[13984]: I0312 12:43:41.722464 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" event={"ID":"f103a96f-c9c5-4f60-8c36-78cb1adafb67","Type":"ContainerStarted","Data":"fbff3519e0e332261df78a46ad4e829592e3cbef233f0cc4c67ed914b82097e3"} Mar 12 12:43:41.723497 master-0 kubenswrapper[13984]: I0312 12:43:41.723409 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" Mar 12 12:43:41.741604 master-0 kubenswrapper[13984]: I0312 12:43:41.736723 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" event={"ID":"f6fe9677-4309-41ac-b5dc-a6e7c95af7e8","Type":"ContainerStarted","Data":"1a0413a9c60418f20a95a6b803e0dcc7d6d1e769ddd3cb94ddf7067fa6a63037"} Mar 12 12:43:41.741604 master-0 kubenswrapper[13984]: I0312 12:43:41.737242 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" Mar 12 12:43:41.763497 master-0 kubenswrapper[13984]: I0312 12:43:41.758150 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" event={"ID":"39c85f77-0a7e-4bcc-8f2c-c7801034b477","Type":"ContainerStarted","Data":"7d1ba48d0a9ba6bd2b68ac06a9ffd876d9a0e05ee4df2421826779507a4a7569"} Mar 12 12:43:41.763497 master-0 kubenswrapper[13984]: I0312 12:43:41.759121 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" Mar 12 12:43:41.785500 master-0 kubenswrapper[13984]: I0312 12:43:41.771563 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" event={"ID":"8e479d11-fd58-4636-9410-297bb6d4f88f","Type":"ContainerStarted","Data":"7dec97b28ec04c63d9424051e2ff3d0d40a7cc3034bbd694512d6d047fa684bb"} Mar 12 12:43:41.785500 master-0 kubenswrapper[13984]: I0312 12:43:41.772196 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" Mar 12 12:43:41.785500 master-0 kubenswrapper[13984]: I0312 12:43:41.777721 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" event={"ID":"4b9d39c4-0231-4d5e-95ca-332b69685f93","Type":"ContainerStarted","Data":"2ab20ece7bd015a626eadaa7aa9d96aa62dcefc0d543790a10879149d60a6cad"} Mar 12 12:43:41.785500 master-0 kubenswrapper[13984]: I0312 12:43:41.778671 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" Mar 12 12:43:41.785500 master-0 kubenswrapper[13984]: I0312 12:43:41.782873 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" event={"ID":"8f0fd875-5848-456f-945e-bb7cabd5af6e","Type":"ContainerStarted","Data":"59475c59ceace6b5aa2c142cfd36b9f808471cb47ac6b07bb0e35e5efb86bf26"} Mar 12 12:43:41.806500 master-0 kubenswrapper[13984]: I0312 12:43:41.795358 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" event={"ID":"af967d1d-e790-4f8c-85a0-a3758a3b5f77","Type":"ContainerStarted","Data":"edd623e64ec3e241914ce716f457d083df9151cf1c092a5525004edef3d93c81"} Mar 12 
12:43:41.806500 master-0 kubenswrapper[13984]: I0312 12:43:41.796198 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" Mar 12 12:43:41.806500 master-0 kubenswrapper[13984]: I0312 12:43:41.804167 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" event={"ID":"ee56de80-f409-484e-87df-4a4a9b6cf52e","Type":"ContainerStarted","Data":"39c9e9da490883c70b754002feed68d2a722ca314ccf3c51f62cdd5caf51074c"} Mar 12 12:43:41.823508 master-0 kubenswrapper[13984]: I0312 12:43:41.817600 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" event={"ID":"d78fcdf2-593b-4e94-93f2-5a7091c6c2af","Type":"ContainerStarted","Data":"bcd94b357df87d328bf564d6c664ae2379bd37acfcc19afb151cb16bcb3080aa"} Mar 12 12:43:41.823508 master-0 kubenswrapper[13984]: I0312 12:43:41.819375 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" Mar 12 12:43:41.845790 master-0 kubenswrapper[13984]: I0312 12:43:41.831964 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" event={"ID":"5d2c9b0f-aecf-4033-8364-d746566a5632","Type":"ContainerStarted","Data":"a20901668f7d45d17f6dae5d9722c9c3de8c026fa66c562d29663e56effd2e37"} Mar 12 12:43:41.845790 master-0 kubenswrapper[13984]: I0312 12:43:41.832945 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" Mar 12 12:43:41.846156 master-0 kubenswrapper[13984]: I0312 12:43:41.845961 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" 
event={"ID":"9877d80a-1dec-4316-9b07-31bef3912af9","Type":"ContainerStarted","Data":"e9a64139eba5195cf9df86d698d58b1374a9df37bcfe461604ae3dad175c28e7"} Mar 12 12:43:41.859506 master-0 kubenswrapper[13984]: I0312 12:43:41.846711 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" podStartSLOduration=10.318783956 podStartE2EDuration="22.846689672s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:21.817157081 +0000 UTC m=+1134.015172563" lastFinishedPulling="2026-03-12 12:43:34.345062747 +0000 UTC m=+1146.543078279" observedRunningTime="2026-03-12 12:43:41.697004622 +0000 UTC m=+1153.895020114" watchObservedRunningTime="2026-03-12 12:43:41.846689672 +0000 UTC m=+1154.044705164" Mar 12 12:43:41.859506 master-0 kubenswrapper[13984]: I0312 12:43:41.846821 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" Mar 12 12:43:41.868564 master-0 kubenswrapper[13984]: I0312 12:43:41.863369 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" podStartSLOduration=5.654174106 podStartE2EDuration="22.863342344s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:21.615698384 +0000 UTC m=+1133.813713876" lastFinishedPulling="2026-03-12 12:43:38.824866602 +0000 UTC m=+1151.022882114" observedRunningTime="2026-03-12 12:43:41.847316109 +0000 UTC m=+1154.045331601" watchObservedRunningTime="2026-03-12 12:43:41.863342344 +0000 UTC m=+1154.061357846" Mar 12 12:43:41.868564 master-0 kubenswrapper[13984]: I0312 12:43:41.864625 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x" 
event={"ID":"fe570816-5558-444f-8bd8-6b5a75a80f77","Type":"ContainerStarted","Data":"77d2020cf82a3e174785c5420bb1cec0586f415ed94631a4adab6d678d74ba02"} Mar 12 12:43:41.868564 master-0 kubenswrapper[13984]: I0312 12:43:41.865337 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x" Mar 12 12:43:42.176380 master-0 kubenswrapper[13984]: I0312 12:43:42.173198 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" podStartSLOduration=6.271790197 podStartE2EDuration="23.173166537s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:21.924795337 +0000 UTC m=+1134.122810829" lastFinishedPulling="2026-03-12 12:43:38.826171677 +0000 UTC m=+1151.024187169" observedRunningTime="2026-03-12 12:43:42.164862751 +0000 UTC m=+1154.362878243" watchObservedRunningTime="2026-03-12 12:43:42.173166537 +0000 UTC m=+1154.371182029" Mar 12 12:43:42.264503 master-0 kubenswrapper[13984]: I0312 12:43:42.262072 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" podStartSLOduration=5.531473341 podStartE2EDuration="23.262054704s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.77941301 +0000 UTC m=+1134.977428502" lastFinishedPulling="2026-03-12 12:43:40.509994373 +0000 UTC m=+1152.708009865" observedRunningTime="2026-03-12 12:43:42.243052127 +0000 UTC m=+1154.441067629" watchObservedRunningTime="2026-03-12 12:43:42.262054704 +0000 UTC m=+1154.460070196" Mar 12 12:43:42.306598 master-0 kubenswrapper[13984]: I0312 12:43:42.301376 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" podStartSLOduration=6.069604859 
podStartE2EDuration="23.301355191s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:21.593201952 +0000 UTC m=+1133.791217444" lastFinishedPulling="2026-03-12 12:43:38.824952284 +0000 UTC m=+1151.022967776" observedRunningTime="2026-03-12 12:43:42.273814973 +0000 UTC m=+1154.471830475" watchObservedRunningTime="2026-03-12 12:43:42.301355191 +0000 UTC m=+1154.499370673" Mar 12 12:43:42.995502 master-0 kubenswrapper[13984]: I0312 12:43:42.991022 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" event={"ID":"90572a74-c9ea-448f-a63a-a6db44987f9e","Type":"ContainerStarted","Data":"b2c1d2b412d45989d2ba58d7d7778fbd6b9a17c49fdd925a460e83493095d3ca"} Mar 12 12:43:42.995502 master-0 kubenswrapper[13984]: I0312 12:43:42.991079 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" Mar 12 12:43:43.004581 master-0 kubenswrapper[13984]: I0312 12:43:42.999417 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" event={"ID":"5446d730-d739-4221-98aa-3e35e455cdbb","Type":"ContainerStarted","Data":"cda38940ac06c700785efd5cddc69273df80ebc3e5587fba37a55f520177759d"} Mar 12 12:43:43.029741 master-0 kubenswrapper[13984]: I0312 12:43:43.029645 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" event={"ID":"41dc4bff-40c2-4671-b9d3-2c4c9380d50e","Type":"ContainerStarted","Data":"b46b5336dc406180ae67452031f0eecdde0f6e062c23a04f4d037e400f3c8f82"} Mar 12 12:43:43.030588 master-0 kubenswrapper[13984]: I0312 12:43:43.030565 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" Mar 12 12:43:43.051513 master-0 kubenswrapper[13984]: 
I0312 12:43:43.045998 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" event={"ID":"e39a6c9f-0c60-41c8-b20f-e619f7d69af5","Type":"ContainerStarted","Data":"b01cbe10a5bbb9fc2ac8a45c130da4cdd0905f9cef4453b06a2e0dd5c90e5635"} Mar 12 12:43:43.051513 master-0 kubenswrapper[13984]: I0312 12:43:43.046772 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" Mar 12 12:43:43.064163 master-0 kubenswrapper[13984]: I0312 12:43:43.064099 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" event={"ID":"4433ab46-d919-4939-84f7-911505e17f63","Type":"ContainerStarted","Data":"a4d89e0aa037c1bab560dcba834670b41c14ecb16798f6c31466710ff57fa332"} Mar 12 12:43:43.263790 master-0 kubenswrapper[13984]: I0312 12:43:43.263641 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl" podStartSLOduration=16.028349091 podStartE2EDuration="24.263625041s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:20.934568707 +0000 UTC m=+1133.132584199" lastFinishedPulling="2026-03-12 12:43:29.169844647 +0000 UTC m=+1141.367860149" observedRunningTime="2026-03-12 12:43:43.261303758 +0000 UTC m=+1155.459319260" watchObservedRunningTime="2026-03-12 12:43:43.263625041 +0000 UTC m=+1155.461640523" Mar 12 12:43:43.267785 master-0 kubenswrapper[13984]: I0312 12:43:43.267720 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" podStartSLOduration=6.462309226 podStartE2EDuration="24.267698892s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.770728374 +0000 UTC 
m=+1134.968743866" lastFinishedPulling="2026-03-12 12:43:40.57611804 +0000 UTC m=+1152.774133532" observedRunningTime="2026-03-12 12:43:42.610905406 +0000 UTC m=+1154.808920898" watchObservedRunningTime="2026-03-12 12:43:43.267698892 +0000 UTC m=+1155.465714374" Mar 12 12:43:43.325596 master-0 kubenswrapper[13984]: I0312 12:43:43.325504 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" podStartSLOduration=8.256602634 podStartE2EDuration="24.311458301s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.770097607 +0000 UTC m=+1134.968113099" lastFinishedPulling="2026-03-12 12:43:38.824953274 +0000 UTC m=+1151.022968766" observedRunningTime="2026-03-12 12:43:43.297272986 +0000 UTC m=+1155.495288488" watchObservedRunningTime="2026-03-12 12:43:43.311458301 +0000 UTC m=+1155.509473793" Mar 12 12:43:43.390620 master-0 kubenswrapper[13984]: I0312 12:43:43.390536 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" podStartSLOduration=7.445048002 podStartE2EDuration="24.390511301s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:21.879586178 +0000 UTC m=+1134.077601680" lastFinishedPulling="2026-03-12 12:43:38.825049487 +0000 UTC m=+1151.023064979" observedRunningTime="2026-03-12 12:43:43.347883522 +0000 UTC m=+1155.545899024" watchObservedRunningTime="2026-03-12 12:43:43.390511301 +0000 UTC m=+1155.588526793" Mar 12 12:43:43.418498 master-0 kubenswrapper[13984]: I0312 12:43:43.416908 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" podStartSLOduration=6.780552107 podStartE2EDuration="24.416890898s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 
12:43:22.770008954 +0000 UTC m=+1134.968024446" lastFinishedPulling="2026-03-12 12:43:40.406347745 +0000 UTC m=+1152.604363237" observedRunningTime="2026-03-12 12:43:43.375903283 +0000 UTC m=+1155.573918775" watchObservedRunningTime="2026-03-12 12:43:43.416890898 +0000 UTC m=+1155.614906390" Mar 12 12:43:43.428167 master-0 kubenswrapper[13984]: I0312 12:43:43.428066 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" podStartSLOduration=7.3659497720000005 podStartE2EDuration="24.428046281s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.726212304 +0000 UTC m=+1134.924227796" lastFinishedPulling="2026-03-12 12:43:39.788308813 +0000 UTC m=+1151.986324305" observedRunningTime="2026-03-12 12:43:43.425750348 +0000 UTC m=+1155.623765860" watchObservedRunningTime="2026-03-12 12:43:43.428046281 +0000 UTC m=+1155.626061773" Mar 12 12:43:43.474347 master-0 kubenswrapper[13984]: I0312 12:43:43.474259 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" podStartSLOduration=7.593607971 podStartE2EDuration="24.474236257s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:21.943607488 +0000 UTC m=+1134.141622980" lastFinishedPulling="2026-03-12 12:43:38.824235764 +0000 UTC m=+1151.022251266" observedRunningTime="2026-03-12 12:43:43.452076594 +0000 UTC m=+1155.650092086" watchObservedRunningTime="2026-03-12 12:43:43.474236257 +0000 UTC m=+1155.672251749" Mar 12 12:43:43.508152 master-0 kubenswrapper[13984]: I0312 12:43:43.505606 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" podStartSLOduration=11.755276366 podStartE2EDuration="24.505582689s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" 
firstStartedPulling="2026-03-12 12:43:21.594620261 +0000 UTC m=+1133.792635753" lastFinishedPulling="2026-03-12 12:43:34.344926544 +0000 UTC m=+1146.542942076" observedRunningTime="2026-03-12 12:43:43.482135981 +0000 UTC m=+1155.680151473" watchObservedRunningTime="2026-03-12 12:43:43.505582689 +0000 UTC m=+1155.703598181" Mar 12 12:43:43.530877 master-0 kubenswrapper[13984]: I0312 12:43:43.530737 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x" podStartSLOduration=11.578237803 podStartE2EDuration="24.530715382s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:21.392427374 +0000 UTC m=+1133.590442866" lastFinishedPulling="2026-03-12 12:43:34.344904923 +0000 UTC m=+1146.542920445" observedRunningTime="2026-03-12 12:43:43.516027383 +0000 UTC m=+1155.714042875" watchObservedRunningTime="2026-03-12 12:43:43.530715382 +0000 UTC m=+1155.728730874" Mar 12 12:43:43.602506 master-0 kubenswrapper[13984]: I0312 12:43:43.599698 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-6n5w5" podStartSLOduration=5.7977354739999996 podStartE2EDuration="23.599677977s" podCreationTimestamp="2026-03-12 12:43:20 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.708459951 +0000 UTC m=+1134.906475443" lastFinishedPulling="2026-03-12 12:43:40.510402454 +0000 UTC m=+1152.708417946" observedRunningTime="2026-03-12 12:43:43.550396327 +0000 UTC m=+1155.748411819" watchObservedRunningTime="2026-03-12 12:43:43.599677977 +0000 UTC m=+1155.797693469" Mar 12 12:43:43.602506 master-0 kubenswrapper[13984]: I0312 12:43:43.601424 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" podStartSLOduration=6.90609434 podStartE2EDuration="24.601412394s" 
podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.710992 +0000 UTC m=+1134.909007492" lastFinishedPulling="2026-03-12 12:43:40.406310044 +0000 UTC m=+1152.604325546" observedRunningTime="2026-03-12 12:43:43.579072007 +0000 UTC m=+1155.777087509" watchObservedRunningTime="2026-03-12 12:43:43.601412394 +0000 UTC m=+1155.799427886" Mar 12 12:43:43.648559 master-0 kubenswrapper[13984]: I0312 12:43:43.648364 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" podStartSLOduration=7.047081013 podStartE2EDuration="24.64834398s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.804284626 +0000 UTC m=+1135.002300118" lastFinishedPulling="2026-03-12 12:43:40.405547593 +0000 UTC m=+1152.603563085" observedRunningTime="2026-03-12 12:43:43.622868827 +0000 UTC m=+1155.820884339" watchObservedRunningTime="2026-03-12 12:43:43.64834398 +0000 UTC m=+1155.846359482" Mar 12 12:43:43.671335 master-0 kubenswrapper[13984]: I0312 12:43:43.670970 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" podStartSLOduration=6.861513968 podStartE2EDuration="24.670946844s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.69737858 +0000 UTC m=+1134.895394072" lastFinishedPulling="2026-03-12 12:43:40.506811456 +0000 UTC m=+1152.704826948" observedRunningTime="2026-03-12 12:43:43.657100058 +0000 UTC m=+1155.855115560" watchObservedRunningTime="2026-03-12 12:43:43.670946844 +0000 UTC m=+1155.868962336" Mar 12 12:43:43.697039 master-0 kubenswrapper[13984]: I0312 12:43:43.694631 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" podStartSLOduration=7.616324627 
podStartE2EDuration="24.694605847s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:22.710375363 +0000 UTC m=+1134.908390845" lastFinishedPulling="2026-03-12 12:43:39.788656573 +0000 UTC m=+1151.986672065" observedRunningTime="2026-03-12 12:43:43.693862927 +0000 UTC m=+1155.891878439" watchObservedRunningTime="2026-03-12 12:43:43.694605847 +0000 UTC m=+1155.892621329" Mar 12 12:43:44.076264 master-0 kubenswrapper[13984]: I0312 12:43:44.075061 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" Mar 12 12:43:47.102384 master-0 kubenswrapper[13984]: I0312 12:43:47.102329 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" event={"ID":"8f0fd875-5848-456f-945e-bb7cabd5af6e","Type":"ContainerStarted","Data":"2dfe5fb62c3cbef0e26e316c3fda53f580ec2cf7667b3d98c17dcd46bd00747a"} Mar 12 12:43:47.103001 master-0 kubenswrapper[13984]: I0312 12:43:47.102498 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:43:47.104591 master-0 kubenswrapper[13984]: I0312 12:43:47.104538 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" event={"ID":"ee56de80-f409-484e-87df-4a4a9b6cf52e","Type":"ContainerStarted","Data":"8a1e1099421b547b16876703166c67b6583876f49bef3321ac394b72b0b530c2"} Mar 12 12:43:47.104710 master-0 kubenswrapper[13984]: I0312 12:43:47.104681 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:47.148285 master-0 kubenswrapper[13984]: I0312 12:43:47.148179 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" podStartSLOduration=23.064855293 podStartE2EDuration="28.148150803s" podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:41.285250829 +0000 UTC m=+1153.483266321" lastFinishedPulling="2026-03-12 12:43:46.368546349 +0000 UTC m=+1158.566561831" observedRunningTime="2026-03-12 12:43:47.140436804 +0000 UTC m=+1159.338452306" watchObservedRunningTime="2026-03-12 12:43:47.148150803 +0000 UTC m=+1159.346166335" Mar 12 12:43:49.708178 master-0 kubenswrapper[13984]: I0312 12:43:49.708096 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-677bd678f7-htjwl" Mar 12 12:43:49.736689 master-0 kubenswrapper[13984]: I0312 12:43:49.736582 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-h626x" Mar 12 12:43:49.815882 master-0 kubenswrapper[13984]: I0312 12:43:49.815785 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-c9cfd" Mar 12 12:43:49.870320 master-0 kubenswrapper[13984]: I0312 12:43:49.870207 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-jzwpg" Mar 12 12:43:49.968831 master-0 kubenswrapper[13984]: I0312 12:43:49.968627 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-g5rdl" Mar 12 12:43:50.025126 master-0 kubenswrapper[13984]: I0312 12:43:50.024951 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" podStartSLOduration=25.648761056 podStartE2EDuration="31.024928599s" 
podCreationTimestamp="2026-03-12 12:43:19 +0000 UTC" firstStartedPulling="2026-03-12 12:43:40.98920225 +0000 UTC m=+1153.187217742" lastFinishedPulling="2026-03-12 12:43:46.365369793 +0000 UTC m=+1158.563385285" observedRunningTime="2026-03-12 12:43:47.168454485 +0000 UTC m=+1159.366469997" watchObservedRunningTime="2026-03-12 12:43:50.024928599 +0000 UTC m=+1162.222944101" Mar 12 12:43:50.054149 master-0 kubenswrapper[13984]: I0312 12:43:50.054087 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-9t965" Mar 12 12:43:50.180944 master-0 kubenswrapper[13984]: I0312 12:43:50.180878 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6bbb499bbc-vg66q" Mar 12 12:43:50.275869 master-0 kubenswrapper[13984]: I0312 12:43:50.275727 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-th4r4" Mar 12 12:43:50.292745 master-0 kubenswrapper[13984]: I0312 12:43:50.292655 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-68f45f9d9f-mnp48" Mar 12 12:43:50.344955 master-0 kubenswrapper[13984]: I0312 12:43:50.344907 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-658d4cdd5-mssg2" Mar 12 12:43:50.389898 master-0 kubenswrapper[13984]: I0312 12:43:50.389828 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-7gggt" Mar 12 12:43:50.422639 master-0 kubenswrapper[13984]: I0312 12:43:50.421350 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-ggqzh" Mar 12 
12:43:50.481138 master-0 kubenswrapper[13984]: I0312 12:43:50.481067 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-569cc54c5-tlxh4" Mar 12 12:43:50.552268 master-0 kubenswrapper[13984]: I0312 12:43:50.552115 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-8pd8v" Mar 12 12:43:50.892697 master-0 kubenswrapper[13984]: I0312 12:43:50.892405 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-677c674df7-m7zm9" Mar 12 12:43:50.951421 master-0 kubenswrapper[13984]: I0312 12:43:50.951325 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-5bskg" Mar 12 12:43:50.961566 master-0 kubenswrapper[13984]: I0312 12:43:50.961503 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6cd66dbd4b-g8bbg" Mar 12 12:43:50.973899 master-0 kubenswrapper[13984]: I0312 12:43:50.973825 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-pbr25" Mar 12 12:43:50.991033 master-0 kubenswrapper[13984]: I0312 12:43:50.990986 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6dd88c6f67-cgmps" Mar 12 12:43:52.594367 master-0 kubenswrapper[13984]: I0312 12:43:52.594307 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " 
pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:52.594367 master-0 kubenswrapper[13984]: I0312 12:43:52.594376 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:52.598435 master-0 kubenswrapper[13984]: I0312 12:43:52.598389 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-metrics-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:52.598588 master-0 kubenswrapper[13984]: I0312 12:43:52.598467 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/f7682f7d-757d-4d0d-90d6-9e7c93a6f069-webhook-certs\") pod \"openstack-operator-controller-manager-7795b46f77-cxdr7\" (UID: \"f7682f7d-757d-4d0d-90d6-9e7c93a6f069\") " pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:52.800906 master-0 kubenswrapper[13984]: I0312 12:43:52.800841 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:53.255411 master-0 kubenswrapper[13984]: I0312 12:43:53.255339 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7"] Mar 12 12:43:53.269676 master-0 kubenswrapper[13984]: W0312 12:43:53.269610 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7682f7d_757d_4d0d_90d6_9e7c93a6f069.slice/crio-5577ffded1e8578ed12c60ab661913121c4f02e3ba8108610849c15c429f2807 WatchSource:0}: Error finding container 5577ffded1e8578ed12c60ab661913121c4f02e3ba8108610849c15c429f2807: Status 404 returned error can't find the container with id 5577ffded1e8578ed12c60ab661913121c4f02e3ba8108610849c15c429f2807 Mar 12 12:43:54.228898 master-0 kubenswrapper[13984]: I0312 12:43:54.228811 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" event={"ID":"f7682f7d-757d-4d0d-90d6-9e7c93a6f069","Type":"ContainerStarted","Data":"583f4b58e078cc84d4c8df4c61b3843d319c721f85e0d5faad763c33f8f79c51"} Mar 12 12:43:54.228898 master-0 kubenswrapper[13984]: I0312 12:43:54.228869 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" event={"ID":"f7682f7d-757d-4d0d-90d6-9e7c93a6f069","Type":"ContainerStarted","Data":"5577ffded1e8578ed12c60ab661913121c4f02e3ba8108610849c15c429f2807"} Mar 12 12:43:54.228898 master-0 kubenswrapper[13984]: I0312 12:43:54.228904 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:43:54.263404 master-0 kubenswrapper[13984]: I0312 12:43:54.263281 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" podStartSLOduration=34.26326268 podStartE2EDuration="34.26326268s" podCreationTimestamp="2026-03-12 12:43:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:43:54.255818598 +0000 UTC m=+1166.453834120" watchObservedRunningTime="2026-03-12 12:43:54.26326268 +0000 UTC m=+1166.461278172" Mar 12 12:43:55.772380 master-0 kubenswrapper[13984]: I0312 12:43:55.772324 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-b8c8d7cc8-6jzjg" Mar 12 12:43:56.439584 master-0 kubenswrapper[13984]: I0312 12:43:56.439441 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s" Mar 12 12:44:02.806913 master-0 kubenswrapper[13984]: I0312 12:44:02.806858 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7795b46f77-cxdr7" Mar 12 12:44:41.027763 master-0 kubenswrapper[13984]: I0312 12:44:41.001399 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cs6gb"] Mar 12 12:44:41.027763 master-0 kubenswrapper[13984]: I0312 12:44:41.003063 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.027763 master-0 kubenswrapper[13984]: I0312 12:44:41.010592 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cs6gb"] Mar 12 12:44:41.031937 master-0 kubenswrapper[13984]: I0312 12:44:41.030555 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 12 12:44:41.031937 master-0 kubenswrapper[13984]: I0312 12:44:41.030819 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 12 12:44:41.033094 master-0 kubenswrapper[13984]: I0312 12:44:41.032188 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 12 12:44:41.076502 master-0 kubenswrapper[13984]: I0312 12:44:41.073712 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffrln\" (UniqueName: \"kubernetes.io/projected/cd429f0b-37cd-4642-b918-2ec7640f17a5-kube-api-access-ffrln\") pod \"dnsmasq-dns-685c76cf85-cs6gb\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.076502 master-0 kubenswrapper[13984]: I0312 12:44:41.074048 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd429f0b-37cd-4642-b918-2ec7640f17a5-config\") pod \"dnsmasq-dns-685c76cf85-cs6gb\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.113500 master-0 kubenswrapper[13984]: I0312 12:44:41.112505 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-5dqn7"] Mar 12 12:44:41.117495 master-0 kubenswrapper[13984]: I0312 12:44:41.114422 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.117495 master-0 kubenswrapper[13984]: I0312 12:44:41.117299 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 12 12:44:41.129727 master-0 kubenswrapper[13984]: I0312 12:44:41.123595 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-5dqn7"] Mar 12 12:44:41.181542 master-0 kubenswrapper[13984]: I0312 12:44:41.175166 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffrln\" (UniqueName: \"kubernetes.io/projected/cd429f0b-37cd-4642-b918-2ec7640f17a5-kube-api-access-ffrln\") pod \"dnsmasq-dns-685c76cf85-cs6gb\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.181542 master-0 kubenswrapper[13984]: I0312 12:44:41.175815 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd429f0b-37cd-4642-b918-2ec7640f17a5-config\") pod \"dnsmasq-dns-685c76cf85-cs6gb\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.181542 master-0 kubenswrapper[13984]: I0312 12:44:41.176733 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd429f0b-37cd-4642-b918-2ec7640f17a5-config\") pod \"dnsmasq-dns-685c76cf85-cs6gb\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.202533 master-0 kubenswrapper[13984]: I0312 12:44:41.192997 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffrln\" (UniqueName: \"kubernetes.io/projected/cd429f0b-37cd-4642-b918-2ec7640f17a5-kube-api-access-ffrln\") pod \"dnsmasq-dns-685c76cf85-cs6gb\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " 
pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.279492 master-0 kubenswrapper[13984]: I0312 12:44:41.277393 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqzlr\" (UniqueName: \"kubernetes.io/projected/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-kube-api-access-pqzlr\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.279492 master-0 kubenswrapper[13984]: I0312 12:44:41.277512 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.279492 master-0 kubenswrapper[13984]: I0312 12:44:41.277651 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-config\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.373251 master-0 kubenswrapper[13984]: I0312 12:44:41.373165 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:44:41.378946 master-0 kubenswrapper[13984]: I0312 12:44:41.378852 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.379255 master-0 kubenswrapper[13984]: I0312 12:44:41.379074 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-config\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.379255 master-0 kubenswrapper[13984]: I0312 12:44:41.379124 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqzlr\" (UniqueName: \"kubernetes.io/projected/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-kube-api-access-pqzlr\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.379722 master-0 kubenswrapper[13984]: I0312 12:44:41.379661 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-dns-svc\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.380416 master-0 kubenswrapper[13984]: I0312 12:44:41.380351 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-config\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " 
pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.404591 master-0 kubenswrapper[13984]: I0312 12:44:41.402599 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqzlr\" (UniqueName: \"kubernetes.io/projected/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-kube-api-access-pqzlr\") pod \"dnsmasq-dns-8476fd89bc-5dqn7\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.484960 master-0 kubenswrapper[13984]: I0312 12:44:41.484692 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:44:41.829833 master-0 kubenswrapper[13984]: I0312 12:44:41.821890 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cs6gb"] Mar 12 12:44:41.829833 master-0 kubenswrapper[13984]: W0312 12:44:41.829232 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcd429f0b_37cd_4642_b918_2ec7640f17a5.slice/crio-5e7d73da0befcf725a590af930f149d9dca7faa70a9a677407b1e56419111b1a WatchSource:0}: Error finding container 5e7d73da0befcf725a590af930f149d9dca7faa70a9a677407b1e56419111b1a: Status 404 returned error can't find the container with id 5e7d73da0befcf725a590af930f149d9dca7faa70a9a677407b1e56419111b1a Mar 12 12:44:41.965978 master-0 kubenswrapper[13984]: W0312 12:44:41.965734 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f7e5749_dab3_4f25_98f8_be2a568cdc5d.slice/crio-ee964eb61355e2c08a5ae7acc5a18af5bd22dc5919700fb29539a6b0ef7524ec WatchSource:0}: Error finding container ee964eb61355e2c08a5ae7acc5a18af5bd22dc5919700fb29539a6b0ef7524ec: Status 404 returned error can't find the container with id ee964eb61355e2c08a5ae7acc5a18af5bd22dc5919700fb29539a6b0ef7524ec Mar 12 12:44:41.971873 master-0 kubenswrapper[13984]: I0312 
12:44:41.968735 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-5dqn7"] Mar 12 12:44:42.738182 master-0 kubenswrapper[13984]: I0312 12:44:42.738058 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" event={"ID":"cd429f0b-37cd-4642-b918-2ec7640f17a5","Type":"ContainerStarted","Data":"5e7d73da0befcf725a590af930f149d9dca7faa70a9a677407b1e56419111b1a"} Mar 12 12:44:42.739639 master-0 kubenswrapper[13984]: I0312 12:44:42.739580 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" event={"ID":"7f7e5749-dab3-4f25-98f8-be2a568cdc5d","Type":"ContainerStarted","Data":"ee964eb61355e2c08a5ae7acc5a18af5bd22dc5919700fb29539a6b0ef7524ec"} Mar 12 12:44:43.939515 master-0 kubenswrapper[13984]: I0312 12:44:43.932928 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cs6gb"] Mar 12 12:44:43.962388 master-0 kubenswrapper[13984]: I0312 12:44:43.962325 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76849d6659-x82h2"] Mar 12 12:44:43.965130 master-0 kubenswrapper[13984]: I0312 12:44:43.965076 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.034607 master-0 kubenswrapper[13984]: I0312 12:44:44.030588 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-x82h2"] Mar 12 12:44:44.148795 master-0 kubenswrapper[13984]: I0312 12:44:44.147698 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-config\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.148795 master-0 kubenswrapper[13984]: I0312 12:44:44.147868 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqjd4\" (UniqueName: \"kubernetes.io/projected/4c532467-afe9-4d74-a5e0-f8f6ac941467-kube-api-access-dqjd4\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.148795 master-0 kubenswrapper[13984]: I0312 12:44:44.147935 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-dns-svc\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.249740 master-0 kubenswrapper[13984]: I0312 12:44:44.249669 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqjd4\" (UniqueName: \"kubernetes.io/projected/4c532467-afe9-4d74-a5e0-f8f6ac941467-kube-api-access-dqjd4\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.249983 master-0 kubenswrapper[13984]: I0312 12:44:44.249773 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-dns-svc\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.249983 master-0 kubenswrapper[13984]: I0312 12:44:44.249872 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-config\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.256784 master-0 kubenswrapper[13984]: I0312 12:44:44.252183 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-dns-svc\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.256784 master-0 kubenswrapper[13984]: I0312 12:44:44.252950 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-config\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.295218 master-0 kubenswrapper[13984]: I0312 12:44:44.295071 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqjd4\" (UniqueName: \"kubernetes.io/projected/4c532467-afe9-4d74-a5e0-f8f6ac941467-kube-api-access-dqjd4\") pod \"dnsmasq-dns-76849d6659-x82h2\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") " pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:44:44.351285 master-0 kubenswrapper[13984]: I0312 12:44:44.350984 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-x82h2"
Mar 12 12:44:44.415654 master-0 kubenswrapper[13984]: I0312 12:44:44.413759 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-5dqn7"]
Mar 12 12:44:44.460151 master-0 kubenswrapper[13984]: I0312 12:44:44.460087 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"]
Mar 12 12:44:44.492854 master-0 kubenswrapper[13984]: I0312 12:44:44.492805 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.496663 master-0 kubenswrapper[13984]: I0312 12:44:44.496180 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"]
Mar 12 12:44:44.667496 master-0 kubenswrapper[13984]: I0312 12:44:44.667406 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-config\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.667694 master-0 kubenswrapper[13984]: I0312 12:44:44.667505 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjtvk\" (UniqueName: \"kubernetes.io/projected/75d275ee-71d3-4fef-944d-0a04b037a6fd-kube-api-access-kjtvk\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.667796 master-0 kubenswrapper[13984]: I0312 12:44:44.667737 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.771206 master-0 kubenswrapper[13984]: I0312 12:44:44.771080 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.771618 master-0 kubenswrapper[13984]: I0312 12:44:44.771230 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-config\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.771618 master-0 kubenswrapper[13984]: I0312 12:44:44.771255 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjtvk\" (UniqueName: \"kubernetes.io/projected/75d275ee-71d3-4fef-944d-0a04b037a6fd-kube-api-access-kjtvk\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.772514 master-0 kubenswrapper[13984]: I0312 12:44:44.772414 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-dns-svc\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:44.773030 master-0 kubenswrapper[13984]: I0312 12:44:44.772964 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-config\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:45.566761 master-0 kubenswrapper[13984]: I0312 12:44:45.566705 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjtvk\" (UniqueName: \"kubernetes.io/projected/75d275ee-71d3-4fef-944d-0a04b037a6fd-kube-api-access-kjtvk\") pod \"dnsmasq-dns-6ff8fd9d5c-rz4c8\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") " pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:45.707032 master-0 kubenswrapper[13984]: I0312 12:44:45.702433 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-x82h2"]
Mar 12 12:44:45.743103 master-0 kubenswrapper[13984]: I0312 12:44:45.743052 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:44:45.794399 master-0 kubenswrapper[13984]: I0312 12:44:45.794331 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-x82h2" event={"ID":"4c532467-afe9-4d74-a5e0-f8f6ac941467","Type":"ContainerStarted","Data":"b8722c033eb2100d9f64f3ae9edb64a595c9c0db134e7252b0b04af6cdecd689"}
Mar 12 12:44:48.078597 master-0 kubenswrapper[13984]: I0312 12:44:48.078537 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 12 12:44:48.079842 master-0 kubenswrapper[13984]: I0312 12:44:48.079817 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 12 12:44:48.085257 master-0 kubenswrapper[13984]: I0312 12:44:48.085195 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 12 12:44:48.092690 master-0 kubenswrapper[13984]: I0312 12:44:48.089979 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 12 12:44:48.108515 master-0 kubenswrapper[13984]: I0312 12:44:48.108436 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle"
Mar 12 12:44:48.179139 master-0 kubenswrapper[13984]: I0312 12:44:48.178732 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 12 12:44:48.197675 master-0 kubenswrapper[13984]: I0312 12:44:48.196763 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65a4af9d-8221-42be-b78f-ada6b347b337-kolla-config\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.197675 master-0 kubenswrapper[13984]: I0312 12:44:48.196874 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4af9d-8221-42be-b78f-ada6b347b337-combined-ca-bundle\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.197675 master-0 kubenswrapper[13984]: I0312 12:44:48.197222 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4af9d-8221-42be-b78f-ada6b347b337-memcached-tls-certs\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.197675 master-0 kubenswrapper[13984]: I0312 12:44:48.197250 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65a4af9d-8221-42be-b78f-ada6b347b337-config-data\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.197675 master-0 kubenswrapper[13984]: I0312 12:44:48.197300 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx25h\" (UniqueName: \"kubernetes.io/projected/65a4af9d-8221-42be-b78f-ada6b347b337-kube-api-access-nx25h\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.299493 master-0 kubenswrapper[13984]: I0312 12:44:48.299413 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65a4af9d-8221-42be-b78f-ada6b347b337-kolla-config\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.299801 master-0 kubenswrapper[13984]: I0312 12:44:48.299504 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4af9d-8221-42be-b78f-ada6b347b337-combined-ca-bundle\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.299801 master-0 kubenswrapper[13984]: I0312 12:44:48.299651 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4af9d-8221-42be-b78f-ada6b347b337-memcached-tls-certs\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.299952 master-0 kubenswrapper[13984]: I0312 12:44:48.299880 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65a4af9d-8221-42be-b78f-ada6b347b337-config-data\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.300016 master-0 kubenswrapper[13984]: I0312 12:44:48.299972 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx25h\" (UniqueName: \"kubernetes.io/projected/65a4af9d-8221-42be-b78f-ada6b347b337-kube-api-access-nx25h\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.300890 master-0 kubenswrapper[13984]: I0312 12:44:48.300664 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/65a4af9d-8221-42be-b78f-ada6b347b337-kolla-config\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.300890 master-0 kubenswrapper[13984]: I0312 12:44:48.300716 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65a4af9d-8221-42be-b78f-ada6b347b337-config-data\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.311622 master-0 kubenswrapper[13984]: I0312 12:44:48.303739 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a4af9d-8221-42be-b78f-ada6b347b337-combined-ca-bundle\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.311622 master-0 kubenswrapper[13984]: I0312 12:44:48.307880 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/65a4af9d-8221-42be-b78f-ada6b347b337-memcached-tls-certs\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.313540 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.315529 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.317747 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.318018 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.318803 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.320845 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.320975 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 12 12:44:48.329563 master-0 kubenswrapper[13984]: I0312 12:44:48.321401 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 12 12:44:48.335497 master-0 kubenswrapper[13984]: I0312 12:44:48.334267 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx25h\" (UniqueName: \"kubernetes.io/projected/65a4af9d-8221-42be-b78f-ada6b347b337-kube-api-access-nx25h\") pod \"memcached-0\" (UID: \"65a4af9d-8221-42be-b78f-ada6b347b337\") " pod="openstack/memcached-0"
Mar 12 12:44:48.345511 master-0 kubenswrapper[13984]: I0312 12:44:48.342810 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.404974 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405045 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405082 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405114 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405153 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3e2e1199-95e4-4120-9b0a-1c026efe6cb4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^157d88c8-dc62-4447-9249-1e0f8b3d86a5\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405173 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405190 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405221 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405292 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405336 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.405377 master-0 kubenswrapper[13984]: I0312 12:44:48.405371 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r9k5\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-kube-api-access-5r9k5\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.409233 master-0 kubenswrapper[13984]: I0312 12:44:48.409068 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 12 12:44:48.508683 master-0 kubenswrapper[13984]: I0312 12:44:48.508547 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3e2e1199-95e4-4120-9b0a-1c026efe6cb4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^157d88c8-dc62-4447-9249-1e0f8b3d86a5\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.510907 master-0 kubenswrapper[13984]: I0312 12:44:48.508625 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.510989 master-0 kubenswrapper[13984]: I0312 12:44:48.510921 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.510989 master-0 kubenswrapper[13984]: I0312 12:44:48.510984 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511133 master-0 kubenswrapper[13984]: I0312 12:44:48.511102 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511225 master-0 kubenswrapper[13984]: I0312 12:44:48.511197 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511282 master-0 kubenswrapper[13984]: I0312 12:44:48.511264 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r9k5\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-kube-api-access-5r9k5\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511347 master-0 kubenswrapper[13984]: I0312 12:44:48.511325 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511384 master-0 kubenswrapper[13984]: I0312 12:44:48.511363 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511417 master-0 kubenswrapper[13984]: I0312 12:44:48.511398 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511450 master-0 kubenswrapper[13984]: I0312 12:44:48.511427 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511526 master-0 kubenswrapper[13984]: I0312 12:44:48.511500 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 12 12:44:48.511568 master-0 kubenswrapper[13984]: I0312 12:44:48.511545 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3e2e1199-95e4-4120-9b0a-1c026efe6cb4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^157d88c8-dc62-4447-9249-1e0f8b3d86a5\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d3399cc3882400f53d01a517a777160b61f5e15fe9879e5dbd17d15a7d081e1f/globalmount\"" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.511629 master-0 kubenswrapper[13984]: I0312 12:44:48.511510 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.512350 master-0 kubenswrapper[13984]: I0312 12:44:48.512318 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.512485 master-0 kubenswrapper[13984]: I0312 12:44:48.512431 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.513816 master-0 kubenswrapper[13984]: I0312 12:44:48.513790 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.516380 master-0 kubenswrapper[13984]: I0312 12:44:48.514682 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.525562 master-0 kubenswrapper[13984]: I0312 12:44:48.523993 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.529147 master-0 kubenswrapper[13984]: I0312 12:44:48.525912 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.531703 master-0 kubenswrapper[13984]: I0312 12:44:48.530863 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r9k5\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-kube-api-access-5r9k5\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.532444 master-0 kubenswrapper[13984]: I0312 12:44:48.532387 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:48.543560 master-0 kubenswrapper[13984]: I0312 12:44:48.543415 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:44:49.589584 master-0 kubenswrapper[13984]: I0312 12:44:49.588724 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 12:44:49.600212 master-0 kubenswrapper[13984]: I0312 12:44:49.600163 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.605849 master-0 kubenswrapper[13984]: I0312 12:44:49.605770 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Mar 12 12:44:49.608907 master-0 kubenswrapper[13984]: I0312 12:44:49.608878 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Mar 12 12:44:49.609094 master-0 kubenswrapper[13984]: I0312 12:44:49.609054 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Mar 12 12:44:49.609275 master-0 kubenswrapper[13984]: I0312 12:44:49.609232 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Mar 12 12:44:49.609356 master-0 kubenswrapper[13984]: I0312 12:44:49.609339 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Mar 12 12:44:49.609472 master-0 kubenswrapper[13984]: I0312 12:44:49.609437 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Mar 12 12:44:49.613647 master-0 kubenswrapper[13984]: I0312 12:44:49.613595 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639192 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639260 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-e6c62044-c29f-482a-b37a-3f0794a3beb8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^9e4a1e73-5b90-4680-98ee-0cf5ac2441ee\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639299 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f328e3e3-f9b4-4a88-9883-694d89c182f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639345 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639470 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639517 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639555 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2ptr\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-kube-api-access-m2ptr\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639601 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f328e3e3-f9b4-4a88-9883-694d89c182f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639660 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639709 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.644549 master-0 kubenswrapper[13984]: I0312 12:44:49.639743 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.741219 master-0 kubenswrapper[13984]: I0312 12:44:49.741140 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.741219 master-0 kubenswrapper[13984]: I0312 12:44:49.741181 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-e6c62044-c29f-482a-b37a-3f0794a3beb8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^9e4a1e73-5b90-4680-98ee-0cf5ac2441ee\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.741219 master-0 kubenswrapper[13984]: I0312 12:44:49.741209 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f328e3e3-f9b4-4a88-9883-694d89c182f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.741575 master-0 kubenswrapper[13984]: I0312 12:44:49.741240 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.741575 master-0 kubenswrapper[13984]: I0312 12:44:49.741263 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.742408 master-0 kubenswrapper[13984]: I0312 12:44:49.742354 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.742526 master-0 kubenswrapper[13984]: I0312 12:44:49.742415 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2ptr\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-kube-api-access-m2ptr\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.742526 master-0 kubenswrapper[13984]: I0312 12:44:49.742457 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f328e3e3-f9b4-4a88-9883-694d89c182f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.742720 master-0 kubenswrapper[13984]: I0312 12:44:49.742654 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.742784 master-0 kubenswrapper[13984]: I0312 12:44:49.742720 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.742784 master-0 kubenswrapper[13984]: I0312 12:44:49.742763 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.744615 master-0 kubenswrapper[13984]: I0312 12:44:49.744568 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.744875 master-0 kubenswrapper[13984]: I0312 12:44:49.744695 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.745425 master-0 kubenswrapper[13984]: I0312 12:44:49.745365 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-server-conf\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.751278 master-0 kubenswrapper[13984]: I0312 12:44:49.751229 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f328e3e3-f9b4-4a88-9883-694d89c182f7-pod-info\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.751278 master-0 kubenswrapper[13984]: I0312 12:44:49.751242 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.752161 master-0 kubenswrapper[13984]: I0312 12:44:49.752118 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:49.752295 master-0 kubenswrapper[13984]: I0312 12:44:49.752266 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 12 12:44:49.752346 master-0 kubenswrapper[13984]: I0312 12:44:49.752313 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-e6c62044-c29f-482a-b37a-3f0794a3beb8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^9e4a1e73-5b90-4680-98ee-0cf5ac2441ee\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/2e8854ab83dbb49c67f243e240d5ccb5219ce69053493487b837bb0ec2f9b975/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 12 12:44:49.756654 master-0 kubenswrapper[13984]: I0312 12:44:49.756619 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0" Mar 12 12:44:49.757180 master-0 kubenswrapper[13984]: I0312 12:44:49.757125 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f328e3e3-f9b4-4a88-9883-694d89c182f7-config-data\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0" Mar 12 12:44:49.759984 master-0 kubenswrapper[13984]: I0312 12:44:49.759952 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2ptr\" (UniqueName: \"kubernetes.io/projected/f328e3e3-f9b4-4a88-9883-694d89c182f7-kube-api-access-m2ptr\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0" Mar 12 12:44:49.763139 master-0 kubenswrapper[13984]: I0312 12:44:49.763079 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f328e3e3-f9b4-4a88-9883-694d89c182f7-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0" Mar 12 12:44:50.149672 master-0 kubenswrapper[13984]: I0312 12:44:50.145290 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3e2e1199-95e4-4120-9b0a-1c026efe6cb4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^157d88c8-dc62-4447-9249-1e0f8b3d86a5\") pod \"rabbitmq-cell1-server-0\" (UID: \"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83\") " pod="openstack/rabbitmq-cell1-server-0" Mar 12 12:44:50.197029 master-0 kubenswrapper[13984]: I0312 12:44:50.196975 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 12 12:44:50.211581 master-0 kubenswrapper[13984]: I0312 12:44:50.211533 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 12 12:44:50.219614 master-0 kubenswrapper[13984]: I0312 12:44:50.214763 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 12 12:44:50.219756 master-0 kubenswrapper[13984]: I0312 12:44:50.219622 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 12 12:44:50.219793 master-0 kubenswrapper[13984]: I0312 12:44:50.219730 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 12 12:44:50.219933 master-0 kubenswrapper[13984]: I0312 12:44:50.219743 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 12 12:44:50.239180 master-0 kubenswrapper[13984]: I0312 12:44:50.239099 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361094 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/ea7532b5-e173-4190-a81d-a1e0d3bcd824-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361144 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-config-data-default\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361194 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm2xh\" (UniqueName: \"kubernetes.io/projected/ea7532b5-e173-4190-a81d-a1e0d3bcd824-kube-api-access-zm2xh\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361225 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea7532b5-e173-4190-a81d-a1e0d3bcd824-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361260 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7532b5-e173-4190-a81d-a1e0d3bcd824-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361291 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"pvc-15722ae4-6eb6-4b78-951a-4fe6095b5f81\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4a86164d-74bd-407e-8e55-1587af4856bb\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361353 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.362543 master-0 kubenswrapper[13984]: I0312 12:44:50.361391 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-kolla-config\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.462957 master-0 kubenswrapper[13984]: I0312 12:44:50.462879 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-15722ae4-6eb6-4b78-951a-4fe6095b5f81\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4a86164d-74bd-407e-8e55-1587af4856bb\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.463197 master-0 kubenswrapper[13984]: I0312 12:44:50.462996 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.463197 master-0 kubenswrapper[13984]: I0312 12:44:50.463039 13984 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-kolla-config\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.463197 master-0 kubenswrapper[13984]: I0312 12:44:50.463075 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ea7532b5-e173-4190-a81d-a1e0d3bcd824-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.463197 master-0 kubenswrapper[13984]: I0312 12:44:50.463093 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-config-data-default\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.463197 master-0 kubenswrapper[13984]: I0312 12:44:50.463121 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm2xh\" (UniqueName: \"kubernetes.io/projected/ea7532b5-e173-4190-a81d-a1e0d3bcd824-kube-api-access-zm2xh\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.463197 master-0 kubenswrapper[13984]: I0312 12:44:50.463138 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea7532b5-e173-4190-a81d-a1e0d3bcd824-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.463197 master-0 kubenswrapper[13984]: I0312 12:44:50.463163 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ea7532b5-e173-4190-a81d-a1e0d3bcd824-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.464646 master-0 kubenswrapper[13984]: I0312 12:44:50.464620 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-config-data-default\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.465578 master-0 kubenswrapper[13984]: I0312 12:44:50.465532 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-kolla-config\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.465895 master-0 kubenswrapper[13984]: I0312 12:44:50.465854 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ea7532b5-e173-4190-a81d-a1e0d3bcd824-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.466459 master-0 kubenswrapper[13984]: I0312 12:44:50.466436 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 12 12:44:50.466562 master-0 kubenswrapper[13984]: I0312 12:44:50.466461 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-15722ae4-6eb6-4b78-951a-4fe6095b5f81\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4a86164d-74bd-407e-8e55-1587af4856bb\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/1ebf9761eeefc343aa059e4281cb513226044e1cec73d95337ff75563b383ef1/globalmount\"" pod="openstack/openstack-galera-0" Mar 12 12:44:50.468198 master-0 kubenswrapper[13984]: I0312 12:44:50.468094 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ea7532b5-e173-4190-a81d-a1e0d3bcd824-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.468198 master-0 kubenswrapper[13984]: I0312 12:44:50.468107 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea7532b5-e173-4190-a81d-a1e0d3bcd824-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.471426 master-0 kubenswrapper[13984]: I0312 12:44:50.471355 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea7532b5-e173-4190-a81d-a1e0d3bcd824-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:50.482497 master-0 kubenswrapper[13984]: I0312 12:44:50.482421 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm2xh\" (UniqueName: \"kubernetes.io/projected/ea7532b5-e173-4190-a81d-a1e0d3bcd824-kube-api-access-zm2xh\") pod \"openstack-galera-0\" 
(UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0" Mar 12 12:44:51.230591 master-0 kubenswrapper[13984]: I0312 12:44:51.230508 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 12:44:51.234116 master-0 kubenswrapper[13984]: I0312 12:44:51.232898 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.236161 master-0 kubenswrapper[13984]: I0312 12:44:51.236127 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 12 12:44:51.236534 master-0 kubenswrapper[13984]: I0312 12:44:51.236518 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 12 12:44:51.237709 master-0 kubenswrapper[13984]: I0312 12:44:51.236796 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 12 12:44:51.256324 master-0 kubenswrapper[13984]: I0312 12:44:51.256266 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 12:44:51.382343 master-0 kubenswrapper[13984]: I0312 12:44:51.380355 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2d6d3a7c-3bef-4e20-9e43-6ddce0928be8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c6b301-1ac3-45ae-b776-0d0f3fb484fb\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.383978 master-0 kubenswrapper[13984]: I0312 12:44:51.383943 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b76fd-5724-4d55-94b6-3c071262be24-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " 
pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.384133 master-0 kubenswrapper[13984]: I0312 12:44:51.384118 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.384320 master-0 kubenswrapper[13984]: I0312 12:44:51.384303 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cb5b76fd-5724-4d55-94b6-3c071262be24-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.384419 master-0 kubenswrapper[13984]: I0312 12:44:51.384400 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hhkq\" (UniqueName: \"kubernetes.io/projected/cb5b76fd-5724-4d55-94b6-3c071262be24-kube-api-access-5hhkq\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.384558 master-0 kubenswrapper[13984]: I0312 12:44:51.384542 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.384672 master-0 kubenswrapper[13984]: I0312 12:44:51.384653 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.384847 master-0 kubenswrapper[13984]: I0312 12:44:51.384834 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb5b76fd-5724-4d55-94b6-3c071262be24-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.487838 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.487958 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cb5b76fd-5724-4d55-94b6-3c071262be24-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.487982 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hhkq\" (UniqueName: \"kubernetes.io/projected/cb5b76fd-5724-4d55-94b6-3c071262be24-kube-api-access-5hhkq\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.488004 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.488032 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.488085 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb5b76fd-5724-4d55-94b6-3c071262be24-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.488117 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2d6d3a7c-3bef-4e20-9e43-6ddce0928be8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c6b301-1ac3-45ae-b776-0d0f3fb484fb\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.488319 master-0 kubenswrapper[13984]: I0312 12:44:51.488249 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b76fd-5724-4d55-94b6-3c071262be24-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.489151 master-0 kubenswrapper[13984]: I0312 12:44:51.488551 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/cb5b76fd-5724-4d55-94b6-3c071262be24-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.489151 master-0 kubenswrapper[13984]: I0312 12:44:51.489122 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.495166 master-0 kubenswrapper[13984]: I0312 12:44:51.490239 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.495166 master-0 kubenswrapper[13984]: I0312 12:44:51.491291 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/cb5b76fd-5724-4d55-94b6-3c071262be24-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.495166 master-0 kubenswrapper[13984]: I0312 12:44:51.491670 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 12 12:44:51.495166 master-0 kubenswrapper[13984]: I0312 12:44:51.491689 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2d6d3a7c-3bef-4e20-9e43-6ddce0928be8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c6b301-1ac3-45ae-b776-0d0f3fb484fb\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/792f35c125e20965ffcced868f11ac6d17212450030f544ef5674d66442662eb/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.495166 master-0 kubenswrapper[13984]: I0312 12:44:51.493306 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb5b76fd-5724-4d55-94b6-3c071262be24-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.497377 master-0 kubenswrapper[13984]: I0312 12:44:51.497317 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb5b76fd-5724-4d55-94b6-3c071262be24-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.512726 master-0 kubenswrapper[13984]: I0312 12:44:51.512681 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hhkq\" (UniqueName: \"kubernetes.io/projected/cb5b76fd-5724-4d55-94b6-3c071262be24-kube-api-access-5hhkq\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0" Mar 12 12:44:51.550686 master-0 kubenswrapper[13984]: I0312 12:44:51.550627 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-e6c62044-c29f-482a-b37a-3f0794a3beb8\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^9e4a1e73-5b90-4680-98ee-0cf5ac2441ee\") pod \"rabbitmq-server-0\" (UID: \"f328e3e3-f9b4-4a88-9883-694d89c182f7\") " pod="openstack/rabbitmq-server-0"
Mar 12 12:44:51.838309 master-0 kubenswrapper[13984]: I0312 12:44:51.838085 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 12 12:44:52.636328 master-0 kubenswrapper[13984]: I0312 12:44:52.636270 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-15722ae4-6eb6-4b78-951a-4fe6095b5f81\" (UniqueName: \"kubernetes.io/csi/topolvm.io^4a86164d-74bd-407e-8e55-1587af4856bb\") pod \"openstack-galera-0\" (UID: \"ea7532b5-e173-4190-a81d-a1e0d3bcd824\") " pod="openstack/openstack-galera-0"
Mar 12 12:44:52.666969 master-0 kubenswrapper[13984]: I0312 12:44:52.661628 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 12 12:44:53.654717 master-0 kubenswrapper[13984]: I0312 12:44:53.653494 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2d6d3a7c-3bef-4e20-9e43-6ddce0928be8\" (UniqueName: \"kubernetes.io/csi/topolvm.io^c2c6b301-1ac3-45ae-b776-0d0f3fb484fb\") pod \"openstack-cell1-galera-0\" (UID: \"cb5b76fd-5724-4d55-94b6-3c071262be24\") " pod="openstack/openstack-cell1-galera-0"
Mar 12 12:44:53.692227 master-0 kubenswrapper[13984]: I0312 12:44:53.692170 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 12 12:44:54.151894 master-0 kubenswrapper[13984]: I0312 12:44:54.151832 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 12 12:44:54.153746 master-0 kubenswrapper[13984]: I0312 12:44:54.153711 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.162653 master-0 kubenswrapper[13984]: I0312 12:44:54.158812 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Mar 12 12:44:54.162653 master-0 kubenswrapper[13984]: I0312 12:44:54.159206 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Mar 12 12:44:54.162653 master-0 kubenswrapper[13984]: I0312 12:44:54.159403 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Mar 12 12:44:54.162653 master-0 kubenswrapper[13984]: I0312 12:44:54.159592 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Mar 12 12:44:54.170301 master-0 kubenswrapper[13984]: I0312 12:44:54.170243 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 12 12:44:54.244274 master-0 kubenswrapper[13984]: I0312 12:44:54.244198 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.244514 master-0 kubenswrapper[13984]: I0312 12:44:54.244303 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-df26400d-77c8-4a4c-a1db-bd2b5a16979b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2a636f29-3568-44e1-81b2-91b545a43bfc\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.244514 master-0 kubenswrapper[13984]: I0312 12:44:54.244395 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.244621 master-0 kubenswrapper[13984]: I0312 12:44:54.244435 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-config\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.244621 master-0 kubenswrapper[13984]: I0312 12:44:54.244572 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.244621 master-0 kubenswrapper[13984]: I0312 12:44:54.244596 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.244798 master-0 kubenswrapper[13984]: I0312 12:44:54.244686 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvmln\" (UniqueName: \"kubernetes.io/projected/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-kube-api-access-mvmln\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.244798 master-0 kubenswrapper[13984]: I0312 12:44:54.244732 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349575 master-0 kubenswrapper[13984]: I0312 12:44:54.349505 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349852 master-0 kubenswrapper[13984]: I0312 12:44:54.349595 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-df26400d-77c8-4a4c-a1db-bd2b5a16979b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2a636f29-3568-44e1-81b2-91b545a43bfc\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349852 master-0 kubenswrapper[13984]: I0312 12:44:54.349643 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349852 master-0 kubenswrapper[13984]: I0312 12:44:54.349673 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-config\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349852 master-0 kubenswrapper[13984]: I0312 12:44:54.349695 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349852 master-0 kubenswrapper[13984]: I0312 12:44:54.349722 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349852 master-0 kubenswrapper[13984]: I0312 12:44:54.349756 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvmln\" (UniqueName: \"kubernetes.io/projected/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-kube-api-access-mvmln\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.349852 master-0 kubenswrapper[13984]: I0312 12:44:54.349778 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.350909 master-0 kubenswrapper[13984]: I0312 12:44:54.350597 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-config\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.350909 master-0 kubenswrapper[13984]: I0312 12:44:54.350844 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.352299 master-0 kubenswrapper[13984]: I0312 12:44:54.352268 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.352910 master-0 kubenswrapper[13984]: I0312 12:44:54.352868 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 12 12:44:54.352994 master-0 kubenswrapper[13984]: I0312 12:44:54.352921 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-df26400d-77c8-4a4c-a1db-bd2b5a16979b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2a636f29-3568-44e1-81b2-91b545a43bfc\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/43f8c132a155d56df8489ad75fa7481b1083c11fd76f02cf3cbffc163abc8494/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.354436 master-0 kubenswrapper[13984]: I0312 12:44:54.354392 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.354845 master-0 kubenswrapper[13984]: I0312 12:44:54.354806 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.364365 master-0 kubenswrapper[13984]: I0312 12:44:54.363945 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.376113 master-0 kubenswrapper[13984]: I0312 12:44:54.376066 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvmln\" (UniqueName: \"kubernetes.io/projected/b6e60ab1-d624-4df2-a3e8-a9044c72edfc-kube-api-access-mvmln\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:54.809120 master-0 kubenswrapper[13984]: I0312 12:44:54.809059 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7dvg9"]
Mar 12 12:44:54.819618 master-0 kubenswrapper[13984]: I0312 12:44:54.810790 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.831170 master-0 kubenswrapper[13984]: I0312 12:44:54.830722 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Mar 12 12:44:54.832925 master-0 kubenswrapper[13984]: I0312 12:44:54.832885 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Mar 12 12:44:54.854553 master-0 kubenswrapper[13984]: I0312 12:44:54.848157 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7dvg9"]
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.856361 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-vttwx"]
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.859352 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.860570 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-run-ovn\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.860651 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zcpd\" (UniqueName: \"kubernetes.io/projected/f5cd50ae-2194-4717-96f0-47b3d353c8b1-kube-api-access-4zcpd\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.860766 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-log-ovn\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.861237 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cd50ae-2194-4717-96f0-47b3d353c8b1-combined-ca-bundle\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.861345 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5cd50ae-2194-4717-96f0-47b3d353c8b1-ovn-controller-tls-certs\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.861374 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-run\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.862519 master-0 kubenswrapper[13984]: I0312 12:44:54.861569 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5cd50ae-2194-4717-96f0-47b3d353c8b1-scripts\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.899588 master-0 kubenswrapper[13984]: I0312 12:44:54.899529 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vttwx"]
Mar 12 12:44:54.963194 master-0 kubenswrapper[13984]: I0312 12:44:54.963093 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-etc-ovs\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:54.963395 master-0 kubenswrapper[13984]: I0312 12:44:54.963356 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-lib\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:54.963395 master-0 kubenswrapper[13984]: I0312 12:44:54.963389 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/095ac834-8b0e-490f-b39f-c97135436fab-scripts\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:54.963603 master-0 kubenswrapper[13984]: I0312 12:44:54.963419 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-run-ovn\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.963603 master-0 kubenswrapper[13984]: I0312 12:44:54.963437 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zcpd\" (UniqueName: \"kubernetes.io/projected/f5cd50ae-2194-4717-96f0-47b3d353c8b1-kube-api-access-4zcpd\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.963603 master-0 kubenswrapper[13984]: I0312 12:44:54.963462 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk29w\" (UniqueName: \"kubernetes.io/projected/095ac834-8b0e-490f-b39f-c97135436fab-kube-api-access-dk29w\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:54.963794 master-0 kubenswrapper[13984]: I0312 12:44:54.963583 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-log-ovn\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.963794 master-0 kubenswrapper[13984]: I0312 12:44:54.963692 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-log\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:54.963794 master-0 kubenswrapper[13984]: I0312 12:44:54.963769 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cd50ae-2194-4717-96f0-47b3d353c8b1-combined-ca-bundle\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.963930 master-0 kubenswrapper[13984]: I0312 12:44:54.963816 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5cd50ae-2194-4717-96f0-47b3d353c8b1-ovn-controller-tls-certs\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.963930 master-0 kubenswrapper[13984]: I0312 12:44:54.963841 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-run\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.963930 master-0 kubenswrapper[13984]: I0312 12:44:54.963868 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-run\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:54.963930 master-0 kubenswrapper[13984]: I0312 12:44:54.963890 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-run-ovn\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.964108 master-0 kubenswrapper[13984]: I0312 12:44:54.963942 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5cd50ae-2194-4717-96f0-47b3d353c8b1-scripts\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.964108 master-0 kubenswrapper[13984]: I0312 12:44:54.963972 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-run\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.964108 master-0 kubenswrapper[13984]: I0312 12:44:54.964084 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f5cd50ae-2194-4717-96f0-47b3d353c8b1-var-log-ovn\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.968226 master-0 kubenswrapper[13984]: I0312 12:44:54.968164 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5cd50ae-2194-4717-96f0-47b3d353c8b1-ovn-controller-tls-certs\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.971111 master-0 kubenswrapper[13984]: I0312 12:44:54.971065 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5cd50ae-2194-4717-96f0-47b3d353c8b1-scripts\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.973871 master-0 kubenswrapper[13984]: I0312 12:44:54.973837 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5cd50ae-2194-4717-96f0-47b3d353c8b1-combined-ca-bundle\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:54.983914 master-0 kubenswrapper[13984]: I0312 12:44:54.983872 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zcpd\" (UniqueName: \"kubernetes.io/projected/f5cd50ae-2194-4717-96f0-47b3d353c8b1-kube-api-access-4zcpd\") pod \"ovn-controller-7dvg9\" (UID: \"f5cd50ae-2194-4717-96f0-47b3d353c8b1\") " pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:55.066206 master-0 kubenswrapper[13984]: I0312 12:44:55.066081 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-log\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.066416 master-0 kubenswrapper[13984]: I0312 12:44:55.066373 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-run\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.066459 master-0 kubenswrapper[13984]: I0312 12:44:55.066430 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-etc-ovs\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.066562 master-0 kubenswrapper[13984]: I0312 12:44:55.066534 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-lib\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.066604 master-0 kubenswrapper[13984]: I0312 12:44:55.066571 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/095ac834-8b0e-490f-b39f-c97135436fab-scripts\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.066638 master-0 kubenswrapper[13984]: I0312 12:44:55.066609 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk29w\" (UniqueName: \"kubernetes.io/projected/095ac834-8b0e-490f-b39f-c97135436fab-kube-api-access-dk29w\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.066638 master-0 kubenswrapper[13984]: I0312 12:44:55.066606 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-run\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.067126 master-0 kubenswrapper[13984]: I0312 12:44:55.066842 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-lib\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.067126 master-0 kubenswrapper[13984]: I0312 12:44:55.067028 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-etc-ovs\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.067218 master-0 kubenswrapper[13984]: I0312 12:44:55.067192 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/095ac834-8b0e-490f-b39f-c97135436fab-var-log\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.069413 master-0 kubenswrapper[13984]: I0312 12:44:55.069375 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/095ac834-8b0e-490f-b39f-c97135436fab-scripts\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.083118 master-0 kubenswrapper[13984]: I0312 12:44:55.083082 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk29w\" (UniqueName: \"kubernetes.io/projected/095ac834-8b0e-490f-b39f-c97135436fab-kube-api-access-dk29w\") pod \"ovn-controller-ovs-vttwx\" (UID: \"095ac834-8b0e-490f-b39f-c97135436fab\") " pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.153078 master-0 kubenswrapper[13984]: I0312 12:44:55.152979 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7dvg9"
Mar 12 12:44:55.191698 master-0 kubenswrapper[13984]: I0312 12:44:55.191627 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-vttwx"
Mar 12 12:44:55.977507 master-0 kubenswrapper[13984]: I0312 12:44:55.973509 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-df26400d-77c8-4a4c-a1db-bd2b5a16979b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^2a636f29-3568-44e1-81b2-91b545a43bfc\") pod \"ovsdbserver-nb-0\" (UID: \"b6e60ab1-d624-4df2-a3e8-a9044c72edfc\") " pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:56.021257 master-0 kubenswrapper[13984]: I0312 12:44:56.019217 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 12 12:44:58.329916 master-0 kubenswrapper[13984]: I0312 12:44:58.329787 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 12 12:44:58.331889 master-0 kubenswrapper[13984]: I0312 12:44:58.331843 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.334333 master-0 kubenswrapper[13984]: I0312 12:44:58.334293 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Mar 12 12:44:58.334600 master-0 kubenswrapper[13984]: I0312 12:44:58.334571 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Mar 12 12:44:58.334768 master-0 kubenswrapper[13984]: I0312 12:44:58.334742 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Mar 12 12:44:58.397553 master-0 kubenswrapper[13984]: I0312 12:44:58.397242 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 12 12:44:58.688906 master-0 kubenswrapper[13984]: I0312 12:44:58.688445 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.688906 master-0 kubenswrapper[13984]: I0312 12:44:58.688762 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4c9f3c67-eed7-47fa-8955-c45f3b3be24c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82552d7b-586f-49e1-b299-7dc2a58bc8e7\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.688906 master-0 kubenswrapper[13984]: I0312 12:44:58.688849 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-config\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.691197 master-0 kubenswrapper[13984]: I0312 12:44:58.689878 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.691197 master-0 kubenswrapper[13984]: I0312 12:44:58.690084 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49zp5\" (UniqueName: \"kubernetes.io/projected/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-kube-api-access-49zp5\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.693816 master-0 kubenswrapper[13984]: I0312 12:44:58.693543 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.693879 master-0 kubenswrapper[13984]: I0312 12:44:58.693848 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.693919 master-0 kubenswrapper[13984]: I0312 12:44:58.693896 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.794982 master-0 kubenswrapper[13984]: I0312 12:44:58.794930 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795201 master-0 kubenswrapper[13984]: I0312 12:44:58.795044 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4c9f3c67-eed7-47fa-8955-c45f3b3be24c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82552d7b-586f-49e1-b299-7dc2a58bc8e7\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795201 master-0 kubenswrapper[13984]: I0312 12:44:58.795074 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-config\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795201 master-0 kubenswrapper[13984]: I0312 12:44:58.795117 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795201 master-0 kubenswrapper[13984]: I0312 12:44:58.795157 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49zp5\" (UniqueName: \"kubernetes.io/projected/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-kube-api-access-49zp5\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795328 master-0 kubenswrapper[13984]: I0312 12:44:58.795201 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795328 master-0 kubenswrapper[13984]: I0312 12:44:58.795268 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795328 master-0 kubenswrapper[13984]: I0312 12:44:58.795302 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.795983 master-0 kubenswrapper[13984]: I0312 12:44:58.795950 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.796710 master-0 kubenswrapper[13984]: I0312 12:44:58.796682 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-config\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.796710 master-0 kubenswrapper[13984]: I0312 12:44:58.796701 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.797570 master-0 kubenswrapper[13984]: I0312 12:44:58.797551 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 12 12:44:58.797634 master-0 kubenswrapper[13984]: I0312 12:44:58.797577 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4c9f3c67-eed7-47fa-8955-c45f3b3be24c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82552d7b-586f-49e1-b299-7dc2a58bc8e7\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/d0c5457edac9b95e67d7cd6e47327e1525b5765769114116c54bee35c89148da/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.799145 master-0 kubenswrapper[13984]: I0312 12:44:58.799111 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.801508 master-0 kubenswrapper[13984]: I0312 12:44:58.801455 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.802855 master-0 kubenswrapper[13984]: I0312 12:44:58.802824 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0"
Mar 12 12:44:58.820934 master-0 kubenswrapper[13984]: I0312 12:44:58.816053 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49zp5\" (UniqueName: \"kubernetes.io/projected/cb8736d5-b5e7-4ab6-9755-d295836ae7a4-kube-api-access-49zp5\") pod \"ovsdbserver-sb-0\"
(UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0" Mar 12 12:45:00.358703 master-0 kubenswrapper[13984]: I0312 12:45:00.358635 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4c9f3c67-eed7-47fa-8955-c45f3b3be24c\" (UniqueName: \"kubernetes.io/csi/topolvm.io^82552d7b-586f-49e1-b299-7dc2a58bc8e7\") pod \"ovsdbserver-sb-0\" (UID: \"cb8736d5-b5e7-4ab6-9755-d295836ae7a4\") " pod="openstack/ovsdbserver-sb-0" Mar 12 12:45:00.522613 master-0 kubenswrapper[13984]: I0312 12:45:00.522459 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 12 12:45:01.619341 master-0 kubenswrapper[13984]: I0312 12:45:01.616202 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 12 12:45:01.664789 master-0 kubenswrapper[13984]: W0312 12:45:01.661166 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae1ce187_98f0_4dc9_ba86_e18b4cfe6a83.slice/crio-d2832fbd0d76a0f7b7d17ad55f7bc4820783c33d0c7f59ded59058df61d716e9 WatchSource:0}: Error finding container d2832fbd0d76a0f7b7d17ad55f7bc4820783c33d0c7f59ded59058df61d716e9: Status 404 returned error can't find the container with id d2832fbd0d76a0f7b7d17ad55f7bc4820783c33d0c7f59ded59058df61d716e9 Mar 12 12:45:01.848865 master-0 kubenswrapper[13984]: I0312 12:45:01.848810 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"] Mar 12 12:45:02.036523 master-0 kubenswrapper[13984]: I0312 12:45:02.036459 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" event={"ID":"75d275ee-71d3-4fef-944d-0a04b037a6fd","Type":"ContainerStarted","Data":"2cc3e64b3fd2fcbd744d3dd58db0c9762ad0f1dba911638f9ba255824155ba83"} Mar 12 12:45:02.038068 master-0 kubenswrapper[13984]: I0312 12:45:02.038041 13984 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83","Type":"ContainerStarted","Data":"d2832fbd0d76a0f7b7d17ad55f7bc4820783c33d0c7f59ded59058df61d716e9"} Mar 12 12:45:02.040323 master-0 kubenswrapper[13984]: I0312 12:45:02.040285 13984 generic.go:334] "Generic (PLEG): container finished" podID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerID="e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979" exitCode=0 Mar 12 12:45:02.040404 master-0 kubenswrapper[13984]: I0312 12:45:02.040357 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-x82h2" event={"ID":"4c532467-afe9-4d74-a5e0-f8f6ac941467","Type":"ContainerDied","Data":"e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979"} Mar 12 12:45:02.042271 master-0 kubenswrapper[13984]: I0312 12:45:02.042243 13984 generic.go:334] "Generic (PLEG): container finished" podID="cd429f0b-37cd-4642-b918-2ec7640f17a5" containerID="8a2bcd43b861a6010ab3cbf3cdfa1691a12a715a149b64b5b0a1f8f7ddd4a026" exitCode=0 Mar 12 12:45:02.042379 master-0 kubenswrapper[13984]: I0312 12:45:02.042345 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" event={"ID":"cd429f0b-37cd-4642-b918-2ec7640f17a5","Type":"ContainerDied","Data":"8a2bcd43b861a6010ab3cbf3cdfa1691a12a715a149b64b5b0a1f8f7ddd4a026"} Mar 12 12:45:02.048067 master-0 kubenswrapper[13984]: I0312 12:45:02.047785 13984 generic.go:334] "Generic (PLEG): container finished" podID="7f7e5749-dab3-4f25-98f8-be2a568cdc5d" containerID="ea8c84dab7601c0d0ba0b91fabbda0ab777f79d021ad133d5aa6e12d295e2ef1" exitCode=0 Mar 12 12:45:02.048067 master-0 kubenswrapper[13984]: I0312 12:45:02.047843 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" 
event={"ID":"7f7e5749-dab3-4f25-98f8-be2a568cdc5d","Type":"ContainerDied","Data":"ea8c84dab7601c0d0ba0b91fabbda0ab777f79d021ad133d5aa6e12d295e2ef1"} Mar 12 12:45:02.339927 master-0 kubenswrapper[13984]: I0312 12:45:02.336932 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7dvg9"] Mar 12 12:45:02.357827 master-0 kubenswrapper[13984]: I0312 12:45:02.357756 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 12 12:45:02.394317 master-0 kubenswrapper[13984]: I0312 12:45:02.394231 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 12 12:45:02.403672 master-0 kubenswrapper[13984]: I0312 12:45:02.403559 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 12 12:45:02.644609 master-0 kubenswrapper[13984]: E0312 12:45:02.644434 13984 log.go:32] "CreateContainer in sandbox from runtime service failed" err=< Mar 12 12:45:02.644609 master-0 kubenswrapper[13984]: rpc error: code = Unknown desc = container create failed: mount `/var/lib/kubelet/pods/4c532467-afe9-4d74-a5e0-f8f6ac941467/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 12 12:45:02.644609 master-0 kubenswrapper[13984]: > podSandboxID="b8722c033eb2100d9f64f3ae9edb64a595c9c0db134e7252b0b04af6cdecd689" Mar 12 12:45:02.645563 master-0 kubenswrapper[13984]: E0312 12:45:02.645086 13984 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 12 12:45:02.645563 master-0 kubenswrapper[13984]: container &Container{Name:dnsmasq-dns,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:fbb5be29e9e4fa11f0743e7f74f2e80dcc7445d24770709ea0e038147f752c51,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed 
--no-resolv --bogus-priv --log-queries],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d7h64dhb8hb8h587h59ch664h5c7h56dh67ch657h657h5fbh5chd8h9hcfh645h594h59ch565h669h648h5d5h8ch597h58bhd5h6fh67dh589hd4q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dqjd4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 },Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 5353 
},Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000800000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-76849d6659-x82h2_openstack(4c532467-afe9-4d74-a5e0-f8f6ac941467): CreateContainerError: container create failed: mount `/var/lib/kubelet/pods/4c532467-afe9-4d74-a5e0-f8f6ac941467/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory Mar 12 12:45:02.645563 master-0 kubenswrapper[13984]: > logger="UnhandledError" Mar 12 12:45:02.647182 master-0 kubenswrapper[13984]: E0312 12:45:02.647133 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dnsmasq-dns\" with CreateContainerError: \"container create failed: mount `/var/lib/kubelet/pods/4c532467-afe9-4d74-a5e0-f8f6ac941467/volume-subpaths/dns-svc/dnsmasq-dns/1` to `etc/dnsmasq.d/hosts/dns-svc`: No such file or directory\\n\"" pod="openstack/dnsmasq-dns-76849d6659-x82h2" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" Mar 12 12:45:02.908214 master-0 kubenswrapper[13984]: I0312 12:45:02.908122 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 12 12:45:02.948457 master-0 kubenswrapper[13984]: W0312 12:45:02.948202 13984 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb5b76fd_5724_4d55_94b6_3c071262be24.slice/crio-f169dc829c08fe762fc7449608f07d54374d0c49e0bf95afb2fa68ed2ef9bd66 WatchSource:0}: Error finding container f169dc829c08fe762fc7449608f07d54374d0c49e0bf95afb2fa68ed2ef9bd66: Status 404 returned error can't find the container with id f169dc829c08fe762fc7449608f07d54374d0c49e0bf95afb2fa68ed2ef9bd66 Mar 12 12:45:03.006671 master-0 kubenswrapper[13984]: I0312 12:45:03.006102 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:45:03.031304 master-0 kubenswrapper[13984]: I0312 12:45:03.031253 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:45:03.071271 master-0 kubenswrapper[13984]: I0312 12:45:03.071185 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cb5b76fd-5724-4d55-94b6-3c071262be24","Type":"ContainerStarted","Data":"f169dc829c08fe762fc7449608f07d54374d0c49e0bf95afb2fa68ed2ef9bd66"} Mar 12 12:45:03.074879 master-0 kubenswrapper[13984]: I0312 12:45:03.074829 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" event={"ID":"cd429f0b-37cd-4642-b918-2ec7640f17a5","Type":"ContainerDied","Data":"5e7d73da0befcf725a590af930f149d9dca7faa70a9a677407b1e56419111b1a"} Mar 12 12:45:03.075021 master-0 kubenswrapper[13984]: I0312 12:45:03.074886 13984 scope.go:117] "RemoveContainer" containerID="8a2bcd43b861a6010ab3cbf3cdfa1691a12a715a149b64b5b0a1f8f7ddd4a026" Mar 12 12:45:03.075021 master-0 kubenswrapper[13984]: I0312 12:45:03.074997 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-685c76cf85-cs6gb" Mar 12 12:45:03.078535 master-0 kubenswrapper[13984]: I0312 12:45:03.077917 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" Mar 12 12:45:03.078535 master-0 kubenswrapper[13984]: I0312 12:45:03.078041 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8476fd89bc-5dqn7" event={"ID":"7f7e5749-dab3-4f25-98f8-be2a568cdc5d","Type":"ContainerDied","Data":"ee964eb61355e2c08a5ae7acc5a18af5bd22dc5919700fb29539a6b0ef7524ec"} Mar 12 12:45:03.082385 master-0 kubenswrapper[13984]: I0312 12:45:03.082339 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f328e3e3-f9b4-4a88-9883-694d89c182f7","Type":"ContainerStarted","Data":"92f2af60bf6887ff9489a5511947f452c074eb210f1fba51f4512441e6969063"} Mar 12 12:45:03.084368 master-0 kubenswrapper[13984]: I0312 12:45:03.084321 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7dvg9" event={"ID":"f5cd50ae-2194-4717-96f0-47b3d353c8b1","Type":"ContainerStarted","Data":"8a142c622492c0430991f368ea3e31ffc6aa28d1c6df88d70dab128b4a34a9be"} Mar 12 12:45:03.089176 master-0 kubenswrapper[13984]: I0312 12:45:03.089133 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"65a4af9d-8221-42be-b78f-ada6b347b337","Type":"ContainerStarted","Data":"9cf24c1797a465447f9222d2dbd431d0c0ac63cf29aefeb34e7291b85d1b9bac"} Mar 12 12:45:03.104952 master-0 kubenswrapper[13984]: I0312 12:45:03.104172 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ea7532b5-e173-4190-a81d-a1e0d3bcd824","Type":"ContainerStarted","Data":"0e4c5338728692d1af5871246d5f2aea12682aaab6ae9384cc436e1fbaf35e55"} Mar 12 12:45:03.116211 master-0 kubenswrapper[13984]: I0312 12:45:03.110757 13984 generic.go:334] "Generic (PLEG): container 
finished" podID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerID="a64646f43a70f0c41cffd6ddfd16c907240575feffa319ef6386a30ba0787f3e" exitCode=0 Mar 12 12:45:03.116211 master-0 kubenswrapper[13984]: I0312 12:45:03.110899 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" event={"ID":"75d275ee-71d3-4fef-944d-0a04b037a6fd","Type":"ContainerDied","Data":"a64646f43a70f0c41cffd6ddfd16c907240575feffa319ef6386a30ba0787f3e"} Mar 12 12:45:03.116837 master-0 kubenswrapper[13984]: I0312 12:45:03.116806 13984 scope.go:117] "RemoveContainer" containerID="ea8c84dab7601c0d0ba0b91fabbda0ab777f79d021ad133d5aa6e12d295e2ef1" Mar 12 12:45:03.207751 master-0 kubenswrapper[13984]: I0312 12:45:03.207699 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffrln\" (UniqueName: \"kubernetes.io/projected/cd429f0b-37cd-4642-b918-2ec7640f17a5-kube-api-access-ffrln\") pod \"cd429f0b-37cd-4642-b918-2ec7640f17a5\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " Mar 12 12:45:03.207969 master-0 kubenswrapper[13984]: I0312 12:45:03.207826 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-dns-svc\") pod \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " Mar 12 12:45:03.207969 master-0 kubenswrapper[13984]: I0312 12:45:03.207897 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-config\") pod \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " Mar 12 12:45:03.208062 master-0 kubenswrapper[13984]: I0312 12:45:03.207973 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/cd429f0b-37cd-4642-b918-2ec7640f17a5-config\") pod \"cd429f0b-37cd-4642-b918-2ec7640f17a5\" (UID: \"cd429f0b-37cd-4642-b918-2ec7640f17a5\") " Mar 12 12:45:03.208109 master-0 kubenswrapper[13984]: I0312 12:45:03.208064 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqzlr\" (UniqueName: \"kubernetes.io/projected/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-kube-api-access-pqzlr\") pod \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\" (UID: \"7f7e5749-dab3-4f25-98f8-be2a568cdc5d\") " Mar 12 12:45:03.213118 master-0 kubenswrapper[13984]: I0312 12:45:03.213038 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-kube-api-access-pqzlr" (OuterVolumeSpecName: "kube-api-access-pqzlr") pod "7f7e5749-dab3-4f25-98f8-be2a568cdc5d" (UID: "7f7e5749-dab3-4f25-98f8-be2a568cdc5d"). InnerVolumeSpecName "kube-api-access-pqzlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:03.213361 master-0 kubenswrapper[13984]: I0312 12:45:03.213136 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd429f0b-37cd-4642-b918-2ec7640f17a5-kube-api-access-ffrln" (OuterVolumeSpecName: "kube-api-access-ffrln") pod "cd429f0b-37cd-4642-b918-2ec7640f17a5" (UID: "cd429f0b-37cd-4642-b918-2ec7640f17a5"). InnerVolumeSpecName "kube-api-access-ffrln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:03.229068 master-0 kubenswrapper[13984]: I0312 12:45:03.229010 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-config" (OuterVolumeSpecName: "config") pod "7f7e5749-dab3-4f25-98f8-be2a568cdc5d" (UID: "7f7e5749-dab3-4f25-98f8-be2a568cdc5d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:03.262503 master-0 kubenswrapper[13984]: I0312 12:45:03.262348 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd429f0b-37cd-4642-b918-2ec7640f17a5-config" (OuterVolumeSpecName: "config") pod "cd429f0b-37cd-4642-b918-2ec7640f17a5" (UID: "cd429f0b-37cd-4642-b918-2ec7640f17a5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:03.265186 master-0 kubenswrapper[13984]: I0312 12:45:03.265036 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f7e5749-dab3-4f25-98f8-be2a568cdc5d" (UID: "7f7e5749-dab3-4f25-98f8-be2a568cdc5d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:03.274773 master-0 kubenswrapper[13984]: I0312 12:45:03.274401 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-vttwx"] Mar 12 12:45:03.330714 master-0 kubenswrapper[13984]: I0312 12:45:03.319276 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqzlr\" (UniqueName: \"kubernetes.io/projected/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-kube-api-access-pqzlr\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:03.330908 master-0 kubenswrapper[13984]: I0312 12:45:03.330749 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffrln\" (UniqueName: \"kubernetes.io/projected/cd429f0b-37cd-4642-b918-2ec7640f17a5-kube-api-access-ffrln\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:03.330908 master-0 kubenswrapper[13984]: I0312 12:45:03.330781 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:03.330908 
master-0 kubenswrapper[13984]: I0312 12:45:03.330844 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f7e5749-dab3-4f25-98f8-be2a568cdc5d-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:03.330908 master-0 kubenswrapper[13984]: I0312 12:45:03.330859 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd429f0b-37cd-4642-b918-2ec7640f17a5-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:03.460265 master-0 kubenswrapper[13984]: I0312 12:45:03.460199 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cs6gb"] Mar 12 12:45:03.470747 master-0 kubenswrapper[13984]: I0312 12:45:03.470452 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-685c76cf85-cs6gb"] Mar 12 12:45:03.591764 master-0 kubenswrapper[13984]: I0312 12:45:03.591713 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-5dqn7"] Mar 12 12:45:03.649139 master-0 kubenswrapper[13984]: I0312 12:45:03.649072 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8476fd89bc-5dqn7"] Mar 12 12:45:03.846525 master-0 kubenswrapper[13984]: I0312 12:45:03.840558 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 12 12:45:03.850334 master-0 kubenswrapper[13984]: W0312 12:45:03.849327 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6e60ab1_d624_4df2_a3e8_a9044c72edfc.slice/crio-74f83f43321e67a1867e1a047721d096cfa9a3670ff028b05d568dd7b0bb72e2 WatchSource:0}: Error finding container 74f83f43321e67a1867e1a047721d096cfa9a3670ff028b05d568dd7b0bb72e2: Status 404 returned error can't find the container with id 74f83f43321e67a1867e1a047721d096cfa9a3670ff028b05d568dd7b0bb72e2 Mar 12 12:45:03.992157 master-0 
kubenswrapper[13984]: I0312 12:45:03.992065 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f7e5749-dab3-4f25-98f8-be2a568cdc5d" path="/var/lib/kubelet/pods/7f7e5749-dab3-4f25-98f8-be2a568cdc5d/volumes" Mar 12 12:45:03.992693 master-0 kubenswrapper[13984]: I0312 12:45:03.992664 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd429f0b-37cd-4642-b918-2ec7640f17a5" path="/var/lib/kubelet/pods/cd429f0b-37cd-4642-b918-2ec7640f17a5/volumes" Mar 12 12:45:04.133830 master-0 kubenswrapper[13984]: I0312 12:45:04.133710 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vttwx" event={"ID":"095ac834-8b0e-490f-b39f-c97135436fab","Type":"ContainerStarted","Data":"c285c15a628b5049a2231def1d00070bca5b7f55a8e48503ea14e69ce4a91a9c"} Mar 12 12:45:04.137050 master-0 kubenswrapper[13984]: I0312 12:45:04.137026 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" event={"ID":"75d275ee-71d3-4fef-944d-0a04b037a6fd","Type":"ContainerStarted","Data":"2a9f4455daaafc256a1c6aa201dcf9b569c74b94387bddc24f467e3bbaa9fdd8"} Mar 12 12:45:04.137659 master-0 kubenswrapper[13984]: I0312 12:45:04.137604 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" Mar 12 12:45:04.152588 master-0 kubenswrapper[13984]: I0312 12:45:04.152442 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-x82h2" event={"ID":"4c532467-afe9-4d74-a5e0-f8f6ac941467","Type":"ContainerStarted","Data":"edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea"} Mar 12 12:45:04.152803 master-0 kubenswrapper[13984]: I0312 12:45:04.152730 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76849d6659-x82h2" Mar 12 12:45:04.182499 master-0 kubenswrapper[13984]: I0312 12:45:04.163099 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovsdbserver-nb-0" event={"ID":"b6e60ab1-d624-4df2-a3e8-a9044c72edfc","Type":"ContainerStarted","Data":"74f83f43321e67a1867e1a047721d096cfa9a3670ff028b05d568dd7b0bb72e2"} Mar 12 12:45:04.182499 master-0 kubenswrapper[13984]: I0312 12:45:04.178027 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" podStartSLOduration=20.177923646 podStartE2EDuration="20.177923646s" podCreationTimestamp="2026-03-12 12:44:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:04.164352077 +0000 UTC m=+1236.362367559" watchObservedRunningTime="2026-03-12 12:45:04.177923646 +0000 UTC m=+1236.375939138" Mar 12 12:45:04.206694 master-0 kubenswrapper[13984]: I0312 12:45:04.206553 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76849d6659-x82h2" podStartSLOduration=5.546355776 podStartE2EDuration="21.206528133s" podCreationTimestamp="2026-03-12 12:44:43 +0000 UTC" firstStartedPulling="2026-03-12 12:44:45.7232907 +0000 UTC m=+1217.921306262" lastFinishedPulling="2026-03-12 12:45:01.383463127 +0000 UTC m=+1233.581478619" observedRunningTime="2026-03-12 12:45:04.189288985 +0000 UTC m=+1236.387304487" watchObservedRunningTime="2026-03-12 12:45:04.206528133 +0000 UTC m=+1236.404543625" Mar 12 12:45:04.233929 master-0 kubenswrapper[13984]: I0312 12:45:04.223075 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 12 12:45:05.178496 master-0 kubenswrapper[13984]: I0312 12:45:05.178434 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb8736d5-b5e7-4ab6-9755-d295836ae7a4","Type":"ContainerStarted","Data":"5791053b07a30a108acf112fb566df09387b37d361728bc11bcfb1f4efbaa140"} Mar 12 12:45:06.759292 master-0 kubenswrapper[13984]: I0312 12:45:06.759239 13984 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-bjdfk"] Mar 12 12:45:06.760562 master-0 kubenswrapper[13984]: E0312 12:45:06.760536 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f7e5749-dab3-4f25-98f8-be2a568cdc5d" containerName="init" Mar 12 12:45:06.760764 master-0 kubenswrapper[13984]: I0312 12:45:06.760749 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f7e5749-dab3-4f25-98f8-be2a568cdc5d" containerName="init" Mar 12 12:45:06.760887 master-0 kubenswrapper[13984]: E0312 12:45:06.760870 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd429f0b-37cd-4642-b918-2ec7640f17a5" containerName="init" Mar 12 12:45:06.760977 master-0 kubenswrapper[13984]: I0312 12:45:06.760962 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd429f0b-37cd-4642-b918-2ec7640f17a5" containerName="init" Mar 12 12:45:06.761447 master-0 kubenswrapper[13984]: I0312 12:45:06.761429 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f7e5749-dab3-4f25-98f8-be2a568cdc5d" containerName="init" Mar 12 12:45:06.761677 master-0 kubenswrapper[13984]: I0312 12:45:06.761662 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd429f0b-37cd-4642-b918-2ec7640f17a5" containerName="init" Mar 12 12:45:06.762877 master-0 kubenswrapper[13984]: I0312 12:45:06.762845 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:06.779562 master-0 kubenswrapper[13984]: I0312 12:45:06.774150 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 12 12:45:06.824102 master-0 kubenswrapper[13984]: I0312 12:45:06.820268 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bjdfk"]
Mar 12 12:45:06.931824 master-0 kubenswrapper[13984]: I0312 12:45:06.931439 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95386842-d198-447d-8b98-f74826151e8b-ovs-rundir\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:06.931824 master-0 kubenswrapper[13984]: I0312 12:45:06.931528 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95386842-d198-447d-8b98-f74826151e8b-config\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:06.931824 master-0 kubenswrapper[13984]: I0312 12:45:06.931606 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpd7b\" (UniqueName: \"kubernetes.io/projected/95386842-d198-447d-8b98-f74826151e8b-kube-api-access-wpd7b\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:06.931824 master-0 kubenswrapper[13984]: I0312 12:45:06.931669 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95386842-d198-447d-8b98-f74826151e8b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:06.931824 master-0 kubenswrapper[13984]: I0312 12:45:06.931701 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95386842-d198-447d-8b98-f74826151e8b-ovn-rundir\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:06.931824 master-0 kubenswrapper[13984]: I0312 12:45:06.931762 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95386842-d198-447d-8b98-f74826151e8b-combined-ca-bundle\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.035094 master-0 kubenswrapper[13984]: I0312 12:45:07.034628 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95386842-d198-447d-8b98-f74826151e8b-ovn-rundir\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.035094 master-0 kubenswrapper[13984]: I0312 12:45:07.034838 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/95386842-d198-447d-8b98-f74826151e8b-ovn-rundir\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.035094 master-0 kubenswrapper[13984]: I0312 12:45:07.034941 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95386842-d198-447d-8b98-f74826151e8b-combined-ca-bundle\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.035094 master-0 kubenswrapper[13984]: I0312 12:45:07.035096 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95386842-d198-447d-8b98-f74826151e8b-ovs-rundir\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.035368 master-0 kubenswrapper[13984]: I0312 12:45:07.035119 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95386842-d198-447d-8b98-f74826151e8b-config\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.035368 master-0 kubenswrapper[13984]: I0312 12:45:07.035212 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/95386842-d198-447d-8b98-f74826151e8b-ovs-rundir\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.042138 master-0 kubenswrapper[13984]: I0312 12:45:07.037183 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95386842-d198-447d-8b98-f74826151e8b-config\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.044776 master-0 kubenswrapper[13984]: I0312 12:45:07.043741 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95386842-d198-447d-8b98-f74826151e8b-combined-ca-bundle\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.044776 master-0 kubenswrapper[13984]: I0312 12:45:07.043944 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpd7b\" (UniqueName: \"kubernetes.io/projected/95386842-d198-447d-8b98-f74826151e8b-kube-api-access-wpd7b\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.050953 master-0 kubenswrapper[13984]: I0312 12:45:07.050896 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95386842-d198-447d-8b98-f74826151e8b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.084631 master-0 kubenswrapper[13984]: I0312 12:45:07.059190 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-x82h2"]
Mar 12 12:45:07.084631 master-0 kubenswrapper[13984]: I0312 12:45:07.059877 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76849d6659-x82h2" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerName="dnsmasq-dns" containerID="cri-o://edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea" gracePeriod=10
Mar 12 12:45:07.084631 master-0 kubenswrapper[13984]: I0312 12:45:07.063005 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95386842-d198-447d-8b98-f74826151e8b-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.128166 master-0 kubenswrapper[13984]: I0312 12:45:07.119987 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpd7b\" (UniqueName: \"kubernetes.io/projected/95386842-d198-447d-8b98-f74826151e8b-kube-api-access-wpd7b\") pod \"ovn-controller-metrics-bjdfk\" (UID: \"95386842-d198-447d-8b98-f74826151e8b\") " pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.166671 master-0 kubenswrapper[13984]: I0312 12:45:07.165582 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79d6ccc4b7-kchfg"]
Mar 12 12:45:07.184592 master-0 kubenswrapper[13984]: I0312 12:45:07.170832 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.184592 master-0 kubenswrapper[13984]: I0312 12:45:07.184054 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 12 12:45:07.255666 master-0 kubenswrapper[13984]: I0312 12:45:07.255610 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79d6ccc4b7-kchfg"]
Mar 12 12:45:07.259206 master-0 kubenswrapper[13984]: I0312 12:45:07.258702 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr7bd\" (UniqueName: \"kubernetes.io/projected/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-kube-api-access-kr7bd\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.259206 master-0 kubenswrapper[13984]: I0312 12:45:07.258775 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-dns-svc\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.259206 master-0 kubenswrapper[13984]: I0312 12:45:07.259015 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-config\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.259206 master-0 kubenswrapper[13984]: I0312 12:45:07.259075 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-ovsdbserver-nb\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.361985 master-0 kubenswrapper[13984]: I0312 12:45:07.361932 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-config\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.362874 master-0 kubenswrapper[13984]: I0312 12:45:07.362845 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-ovsdbserver-nb\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.364499 master-0 kubenswrapper[13984]: I0312 12:45:07.364452 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-config\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.364655 master-0 kubenswrapper[13984]: I0312 12:45:07.364627 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr7bd\" (UniqueName: \"kubernetes.io/projected/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-kube-api-access-kr7bd\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.364773 master-0 kubenswrapper[13984]: I0312 12:45:07.364756 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-dns-svc\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.365794 master-0 kubenswrapper[13984]: I0312 12:45:07.365722 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-dns-svc\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.372401 master-0 kubenswrapper[13984]: I0312 12:45:07.368698 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-ovsdbserver-nb\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.400228 master-0 kubenswrapper[13984]: I0312 12:45:07.398596 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"]
Mar 12 12:45:07.400228 master-0 kubenswrapper[13984]: I0312 12:45:07.399683 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" podUID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerName="dnsmasq-dns" containerID="cri-o://2a9f4455daaafc256a1c6aa201dcf9b569c74b94387bddc24f467e3bbaa9fdd8" gracePeriod=10
Mar 12 12:45:07.402871 master-0 kubenswrapper[13984]: I0312 12:45:07.402838 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr7bd\" (UniqueName: \"kubernetes.io/projected/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-kube-api-access-kr7bd\") pod \"dnsmasq-dns-79d6ccc4b7-kchfg\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") " pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.415590 master-0 kubenswrapper[13984]: I0312 12:45:07.415522 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76f498f559-b67fm"]
Mar 12 12:45:07.417756 master-0 kubenswrapper[13984]: I0312 12:45:07.417701 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.421586 master-0 kubenswrapper[13984]: I0312 12:45:07.421452 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-bjdfk"
Mar 12 12:45:07.422262 master-0 kubenswrapper[13984]: I0312 12:45:07.422173 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Mar 12 12:45:07.484673 master-0 kubenswrapper[13984]: I0312 12:45:07.484011 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-sb\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.484673 master-0 kubenswrapper[13984]: I0312 12:45:07.484152 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-nb\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.484673 master-0 kubenswrapper[13984]: I0312 12:45:07.484228 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-dns-svc\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.484673 master-0 kubenswrapper[13984]: I0312 12:45:07.484407 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt7fr\" (UniqueName: \"kubernetes.io/projected/7f87409d-d6ed-4313-8daa-e92892584f21-kube-api-access-gt7fr\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.484673 master-0 kubenswrapper[13984]: I0312 12:45:07.484562 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-config\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.497807 master-0 kubenswrapper[13984]: I0312 12:45:07.497556 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-b67fm"]
Mar 12 12:45:07.586542 master-0 kubenswrapper[13984]: I0312 12:45:07.586464 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-sb\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.586681 master-0 kubenswrapper[13984]: I0312 12:45:07.586605 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-nb\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.586681 master-0 kubenswrapper[13984]: I0312 12:45:07.586670 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-dns-svc\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.586787 master-0 kubenswrapper[13984]: I0312 12:45:07.586767 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt7fr\" (UniqueName: \"kubernetes.io/projected/7f87409d-d6ed-4313-8daa-e92892584f21-kube-api-access-gt7fr\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.586859 master-0 kubenswrapper[13984]: I0312 12:45:07.586842 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-config\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.588121 master-0 kubenswrapper[13984]: I0312 12:45:07.588096 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-config\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.588907 master-0 kubenswrapper[13984]: I0312 12:45:07.588570 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-nb\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.588907 master-0 kubenswrapper[13984]: I0312 12:45:07.588749 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-dns-svc\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.594469 master-0 kubenswrapper[13984]: I0312 12:45:07.592718 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-sb\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.641306 master-0 kubenswrapper[13984]: I0312 12:45:07.641245 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt7fr\" (UniqueName: \"kubernetes.io/projected/7f87409d-d6ed-4313-8daa-e92892584f21-kube-api-access-gt7fr\") pod \"dnsmasq-dns-76f498f559-b67fm\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.651297 master-0 kubenswrapper[13984]: I0312 12:45:07.651271 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:07.916749 master-0 kubenswrapper[13984]: I0312 12:45:07.916695 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:07.943805 master-0 kubenswrapper[13984]: I0312 12:45:07.943767 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-x82h2"
Mar 12 12:45:07.999404 master-0 kubenswrapper[13984]: I0312 12:45:07.999345 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-config\") pod \"4c532467-afe9-4d74-a5e0-f8f6ac941467\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") "
Mar 12 12:45:07.999614 master-0 kubenswrapper[13984]: I0312 12:45:07.999420 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-dns-svc\") pod \"4c532467-afe9-4d74-a5e0-f8f6ac941467\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") "
Mar 12 12:45:07.999614 master-0 kubenswrapper[13984]: I0312 12:45:07.999568 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dqjd4\" (UniqueName: \"kubernetes.io/projected/4c532467-afe9-4d74-a5e0-f8f6ac941467-kube-api-access-dqjd4\") pod \"4c532467-afe9-4d74-a5e0-f8f6ac941467\" (UID: \"4c532467-afe9-4d74-a5e0-f8f6ac941467\") "
Mar 12 12:45:08.022059 master-0 kubenswrapper[13984]: I0312 12:45:08.021965 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c532467-afe9-4d74-a5e0-f8f6ac941467-kube-api-access-dqjd4" (OuterVolumeSpecName: "kube-api-access-dqjd4") pod "4c532467-afe9-4d74-a5e0-f8f6ac941467" (UID: "4c532467-afe9-4d74-a5e0-f8f6ac941467"). InnerVolumeSpecName "kube-api-access-dqjd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:45:08.116047 master-0 kubenswrapper[13984]: I0312 12:45:08.114232 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dqjd4\" (UniqueName: \"kubernetes.io/projected/4c532467-afe9-4d74-a5e0-f8f6ac941467-kube-api-access-dqjd4\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:08.226597 master-0 kubenswrapper[13984]: I0312 12:45:08.226147 13984 generic.go:334] "Generic (PLEG): container finished" podID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerID="2a9f4455daaafc256a1c6aa201dcf9b569c74b94387bddc24f467e3bbaa9fdd8" exitCode=0
Mar 12 12:45:08.226597 master-0 kubenswrapper[13984]: I0312 12:45:08.226310 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" event={"ID":"75d275ee-71d3-4fef-944d-0a04b037a6fd","Type":"ContainerDied","Data":"2a9f4455daaafc256a1c6aa201dcf9b569c74b94387bddc24f467e3bbaa9fdd8"}
Mar 12 12:45:08.243508 master-0 kubenswrapper[13984]: I0312 12:45:08.241238 13984 generic.go:334] "Generic (PLEG): container finished" podID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerID="edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea" exitCode=0
Mar 12 12:45:08.243508 master-0 kubenswrapper[13984]: I0312 12:45:08.241310 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-x82h2" event={"ID":"4c532467-afe9-4d74-a5e0-f8f6ac941467","Type":"ContainerDied","Data":"edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea"}
Mar 12 12:45:08.243508 master-0 kubenswrapper[13984]: I0312 12:45:08.241340 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76849d6659-x82h2" event={"ID":"4c532467-afe9-4d74-a5e0-f8f6ac941467","Type":"ContainerDied","Data":"b8722c033eb2100d9f64f3ae9edb64a595c9c0db134e7252b0b04af6cdecd689"}
Mar 12 12:45:08.243508 master-0 kubenswrapper[13984]: I0312 12:45:08.241357 13984 scope.go:117] "RemoveContainer" containerID="edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea"
Mar 12 12:45:08.243508 master-0 kubenswrapper[13984]: I0312 12:45:08.241518 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76849d6659-x82h2"
Mar 12 12:45:08.246636 master-0 kubenswrapper[13984]: I0312 12:45:08.245036 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"65a4af9d-8221-42be-b78f-ada6b347b337","Type":"ContainerStarted","Data":"3dddd4f14bdd454c0cc23a3f6b68974b5ddf857f9af57428038b48ad78c5d7fe"}
Mar 12 12:45:08.246636 master-0 kubenswrapper[13984]: I0312 12:45:08.245634 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Mar 12 12:45:08.319713 master-0 kubenswrapper[13984]: I0312 12:45:08.317941 13984 scope.go:117] "RemoveContainer" containerID="e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979"
Mar 12 12:45:08.401557 master-0 kubenswrapper[13984]: I0312 12:45:08.401455 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-bjdfk"]
Mar 12 12:45:08.456021 master-0 kubenswrapper[13984]: I0312 12:45:08.455674 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=15.558870326 podStartE2EDuration="20.455654757s" podCreationTimestamp="2026-03-12 12:44:48 +0000 UTC" firstStartedPulling="2026-03-12 12:45:02.339853417 +0000 UTC m=+1234.537868909" lastFinishedPulling="2026-03-12 12:45:07.236637848 +0000 UTC m=+1239.434653340" observedRunningTime="2026-03-12 12:45:08.383763402 +0000 UTC m=+1240.581778894" watchObservedRunningTime="2026-03-12 12:45:08.455654757 +0000 UTC m=+1240.653670269"
Mar 12 12:45:08.494658 master-0 kubenswrapper[13984]: I0312 12:45:08.494589 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79d6ccc4b7-kchfg"]
Mar 12 12:45:08.552330 master-0 kubenswrapper[13984]: I0312 12:45:08.552272 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c532467-afe9-4d74-a5e0-f8f6ac941467" (UID: "4c532467-afe9-4d74-a5e0-f8f6ac941467"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:08.583193 master-0 kubenswrapper[13984]: W0312 12:45:08.583140 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f6e7440_1cbe_4a21_a714_c21fe6f3b11d.slice/crio-485db5fb52663a379d50afce51ea53cdee8bd219fc86008daa2e777934d12d21 WatchSource:0}: Error finding container 485db5fb52663a379d50afce51ea53cdee8bd219fc86008daa2e777934d12d21: Status 404 returned error can't find the container with id 485db5fb52663a379d50afce51ea53cdee8bd219fc86008daa2e777934d12d21
Mar 12 12:45:08.651773 master-0 kubenswrapper[13984]: I0312 12:45:08.643796 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:08.693858 master-0 kubenswrapper[13984]: I0312 12:45:08.693813 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-config" (OuterVolumeSpecName: "config") pod "4c532467-afe9-4d74-a5e0-f8f6ac941467" (UID: "4c532467-afe9-4d74-a5e0-f8f6ac941467"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:08.714909 master-0 kubenswrapper[13984]: I0312 12:45:08.714863 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-b67fm"]
Mar 12 12:45:08.747705 master-0 kubenswrapper[13984]: I0312 12:45:08.747645 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c532467-afe9-4d74-a5e0-f8f6ac941467-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:08.766933 master-0 kubenswrapper[13984]: I0312 12:45:08.766870 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:45:08.918615 master-0 kubenswrapper[13984]: I0312 12:45:08.915596 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-x82h2"]
Mar 12 12:45:08.938732 master-0 kubenswrapper[13984]: I0312 12:45:08.937989 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76849d6659-x82h2"]
Mar 12 12:45:08.964536 master-0 kubenswrapper[13984]: I0312 12:45:08.959061 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjtvk\" (UniqueName: \"kubernetes.io/projected/75d275ee-71d3-4fef-944d-0a04b037a6fd-kube-api-access-kjtvk\") pod \"75d275ee-71d3-4fef-944d-0a04b037a6fd\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") "
Mar 12 12:45:08.964536 master-0 kubenswrapper[13984]: I0312 12:45:08.959179 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-config\") pod \"75d275ee-71d3-4fef-944d-0a04b037a6fd\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") "
Mar 12 12:45:08.964536 master-0 kubenswrapper[13984]: I0312 12:45:08.959208 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-dns-svc\") pod \"75d275ee-71d3-4fef-944d-0a04b037a6fd\" (UID: \"75d275ee-71d3-4fef-944d-0a04b037a6fd\") "
Mar 12 12:45:08.987512 master-0 kubenswrapper[13984]: I0312 12:45:08.986108 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75d275ee-71d3-4fef-944d-0a04b037a6fd-kube-api-access-kjtvk" (OuterVolumeSpecName: "kube-api-access-kjtvk") pod "75d275ee-71d3-4fef-944d-0a04b037a6fd" (UID: "75d275ee-71d3-4fef-944d-0a04b037a6fd"). InnerVolumeSpecName "kube-api-access-kjtvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:45:09.036512 master-0 kubenswrapper[13984]: I0312 12:45:09.035077 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "75d275ee-71d3-4fef-944d-0a04b037a6fd" (UID: "75d275ee-71d3-4fef-944d-0a04b037a6fd"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:09.067547 master-0 kubenswrapper[13984]: I0312 12:45:09.066654 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjtvk\" (UniqueName: \"kubernetes.io/projected/75d275ee-71d3-4fef-944d-0a04b037a6fd-kube-api-access-kjtvk\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:09.067547 master-0 kubenswrapper[13984]: I0312 12:45:09.066690 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:09.106547 master-0 kubenswrapper[13984]: I0312 12:45:09.105142 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-config" (OuterVolumeSpecName: "config") pod "75d275ee-71d3-4fef-944d-0a04b037a6fd" (UID: "75d275ee-71d3-4fef-944d-0a04b037a6fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:09.176572 master-0 kubenswrapper[13984]: I0312 12:45:09.176418 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/75d275ee-71d3-4fef-944d-0a04b037a6fd-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:09.262929 master-0 kubenswrapper[13984]: I0312 12:45:09.262757 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8" event={"ID":"75d275ee-71d3-4fef-944d-0a04b037a6fd","Type":"ContainerDied","Data":"2cc3e64b3fd2fcbd744d3dd58db0c9762ad0f1dba911638f9ba255824155ba83"}
Mar 12 12:45:09.263137 master-0 kubenswrapper[13984]: I0312 12:45:09.262979 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"
Mar 12 12:45:09.265733 master-0 kubenswrapper[13984]: I0312 12:45:09.265649 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" event={"ID":"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d","Type":"ContainerStarted","Data":"485db5fb52663a379d50afce51ea53cdee8bd219fc86008daa2e777934d12d21"}
Mar 12 12:45:09.269846 master-0 kubenswrapper[13984]: I0312 12:45:09.268445 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83","Type":"ContainerStarted","Data":"b54f88294904df46f0848fd01fac6018f1cdf4c4066102661c571b0f08ff4a6c"}
Mar 12 12:45:09.273388 master-0 kubenswrapper[13984]: I0312 12:45:09.273164 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bjdfk" event={"ID":"95386842-d198-447d-8b98-f74826151e8b","Type":"ContainerStarted","Data":"8798799b2e1754f6fc14ba757eec5264d7bc4331ddaaf679815ad2ca8ec5a03f"}
Mar 12 12:45:09.308826 master-0 kubenswrapper[13984]: I0312 12:45:09.308757 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f328e3e3-f9b4-4a88-9883-694d89c182f7","Type":"ContainerStarted","Data":"733e368e6e05be7e45d1f385392d95b4c6f01e221666f2ea77cca3a0fafa2157"}
Mar 12 12:45:09.437237 master-0 kubenswrapper[13984]: I0312 12:45:09.431462 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"]
Mar 12 12:45:09.451692 master-0 kubenswrapper[13984]: I0312 12:45:09.451601 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ff8fd9d5c-rz4c8"]
Mar 12 12:45:09.994217 master-0 kubenswrapper[13984]: I0312 12:45:09.994149 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" path="/var/lib/kubelet/pods/4c532467-afe9-4d74-a5e0-f8f6ac941467/volumes"
Mar 12 12:45:09.995038 master-0 kubenswrapper[13984]: I0312 12:45:09.995013 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75d275ee-71d3-4fef-944d-0a04b037a6fd" path="/var/lib/kubelet/pods/75d275ee-71d3-4fef-944d-0a04b037a6fd/volumes"
Mar 12 12:45:11.051435 master-0 kubenswrapper[13984]: W0312 12:45:11.051358 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f87409d_d6ed_4313_8daa_e92892584f21.slice/crio-b9eea651e3dfeb30fa9261fdb5966a86ee9b84562fdc08640a0f96b03129ac42 WatchSource:0}: Error finding container b9eea651e3dfeb30fa9261fdb5966a86ee9b84562fdc08640a0f96b03129ac42: Status 404 returned error can't find the container with id b9eea651e3dfeb30fa9261fdb5966a86ee9b84562fdc08640a0f96b03129ac42
Mar 12 12:45:11.058107 master-0 kubenswrapper[13984]: I0312 12:45:11.058077 13984 scope.go:117] "RemoveContainer" containerID="edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea"
Mar 12 12:45:11.058368 master-0 kubenswrapper[13984]: E0312 12:45:11.058338 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea\": container with ID starting with edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea not found: ID does not exist" containerID="edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea"
Mar 12 12:45:11.058411 master-0 kubenswrapper[13984]: I0312 12:45:11.058369 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea"} err="failed to get container status \"edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea\": rpc error: code = NotFound desc = could not find container \"edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea\": container with ID starting with edf4755602d07079b97393f59c8a040f6b7e908787dbfa62154ebd9076cf14ea not found: ID does not exist"
Mar 12 12:45:11.058454 master-0 kubenswrapper[13984]: I0312 12:45:11.058389 13984 scope.go:117] "RemoveContainer" containerID="e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979"
Mar 12 12:45:11.058852 master-0 kubenswrapper[13984]: E0312 12:45:11.058801 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979\": container with ID starting with e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979 not found: ID does not exist" containerID="e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979"
Mar 12 12:45:11.058904 master-0 kubenswrapper[13984]: I0312 12:45:11.058865 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979"} err="failed to get container status \"e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979\": rpc error: code = NotFound desc = could not find container \"e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979\": container with ID starting with e9eb5dbdff853c026dc359addf49783819cf6410f0dd2e0d793ce0e1a4f35979 not found: ID does not exist"
Mar 12 12:45:11.058904 master-0 kubenswrapper[13984]: I0312 12:45:11.058899 13984 scope.go:117] "RemoveContainer" containerID="2a9f4455daaafc256a1c6aa201dcf9b569c74b94387bddc24f467e3bbaa9fdd8"
Mar 12 12:45:11.334152 master-0 kubenswrapper[13984]: I0312 12:45:11.334013 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-b67fm" event={"ID":"7f87409d-d6ed-4313-8daa-e92892584f21","Type":"ContainerStarted","Data":"b9eea651e3dfeb30fa9261fdb5966a86ee9b84562fdc08640a0f96b03129ac42"}
Mar 12 12:45:13.207749 master-0 kubenswrapper[13984]: I0312 12:45:13.207662 13984 scope.go:117] "RemoveContainer" containerID="a64646f43a70f0c41cffd6ddfd16c907240575feffa319ef6386a30ba0787f3e"
Mar 12 12:45:13.363542 master-0 kubenswrapper[13984]: I0312 12:45:13.360466 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" event={"ID":"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d","Type":"ContainerStarted","Data":"4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4"}
Mar 12 12:45:13.410728 master-0 kubenswrapper[13984]: I0312 12:45:13.410684 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 12 12:45:14.374812 master-0 kubenswrapper[13984]: I0312 12:45:14.374200 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vttwx" event={"ID":"095ac834-8b0e-490f-b39f-c97135436fab","Type":"ContainerStarted","Data":"10923612a4b81395e95cc6408a5b08245c7a0837c668197e6432bb9f9ee4e010"}
Mar 12 12:45:14.378388 master-0 kubenswrapper[13984]: I0312 12:45:14.378304 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-7dvg9"
Mar 12 12:45:14.391374 master-0 kubenswrapper[13984]: I0312 12:45:14.391239 13984 generic.go:334] "Generic (PLEG): container finished" podID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerID="4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4" exitCode=0
Mar 12 12:45:14.391374 master-0 kubenswrapper[13984]: I0312 12:45:14.391321 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" event={"ID":"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d","Type":"ContainerDied","Data":"4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4"}
Mar 12 12:45:14.396870 master-0 kubenswrapper[13984]: I0312 12:45:14.396820 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-b67fm"
event={"ID":"7f87409d-d6ed-4313-8daa-e92892584f21","Type":"ContainerStarted","Data":"8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215"} Mar 12 12:45:14.414706 master-0 kubenswrapper[13984]: I0312 12:45:14.414647 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b6e60ab1-d624-4df2-a3e8-a9044c72edfc","Type":"ContainerStarted","Data":"4685800c8c08c48af35a45e1fea1a7ce1ccc3a48416c7f4c859007b6aca45a93"} Mar 12 12:45:14.483314 master-0 kubenswrapper[13984]: I0312 12:45:14.482999 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-7dvg9" podStartSLOduration=8.929654457 podStartE2EDuration="20.482978288s" podCreationTimestamp="2026-03-12 12:44:54 +0000 UTC" firstStartedPulling="2026-03-12 12:45:02.339999451 +0000 UTC m=+1234.538014943" lastFinishedPulling="2026-03-12 12:45:13.893323282 +0000 UTC m=+1246.091338774" observedRunningTime="2026-03-12 12:45:14.470890095 +0000 UTC m=+1246.668905617" watchObservedRunningTime="2026-03-12 12:45:14.482978288 +0000 UTC m=+1246.680993780" Mar 12 12:45:15.426865 master-0 kubenswrapper[13984]: I0312 12:45:15.426710 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" event={"ID":"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d","Type":"ContainerStarted","Data":"cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32"} Mar 12 12:45:15.427690 master-0 kubenswrapper[13984]: I0312 12:45:15.427657 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" Mar 12 12:45:15.429664 master-0 kubenswrapper[13984]: I0312 12:45:15.429622 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cb5b76fd-5724-4d55-94b6-3c071262be24","Type":"ContainerStarted","Data":"13a16dcf10d3a770987049785c46248de3f5848dff290d09d9282c321a077dcc"} Mar 12 12:45:15.436733 master-0 
kubenswrapper[13984]: I0312 12:45:15.434198 13984 generic.go:334] "Generic (PLEG): container finished" podID="7f87409d-d6ed-4313-8daa-e92892584f21" containerID="8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215" exitCode=0 Mar 12 12:45:15.436733 master-0 kubenswrapper[13984]: I0312 12:45:15.434274 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-b67fm" event={"ID":"7f87409d-d6ed-4313-8daa-e92892584f21","Type":"ContainerDied","Data":"8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215"} Mar 12 12:45:15.436733 master-0 kubenswrapper[13984]: I0312 12:45:15.434306 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-b67fm" event={"ID":"7f87409d-d6ed-4313-8daa-e92892584f21","Type":"ContainerStarted","Data":"39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb"} Mar 12 12:45:15.436733 master-0 kubenswrapper[13984]: I0312 12:45:15.435177 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76f498f559-b67fm" Mar 12 12:45:15.437640 master-0 kubenswrapper[13984]: I0312 12:45:15.437606 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7dvg9" event={"ID":"f5cd50ae-2194-4717-96f0-47b3d353c8b1","Type":"ContainerStarted","Data":"279675074c49837315e413d6048667889f309ce7943c6b6c7e38643961127ba3"} Mar 12 12:45:15.440219 master-0 kubenswrapper[13984]: I0312 12:45:15.440183 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb8736d5-b5e7-4ab6-9755-d295836ae7a4","Type":"ContainerStarted","Data":"90c9dee5be808c399d6d3e7a87ac35418acd20cd94848c687721de063f835c31"} Mar 12 12:45:15.442708 master-0 kubenswrapper[13984]: I0312 12:45:15.442634 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"ea7532b5-e173-4190-a81d-a1e0d3bcd824","Type":"ContainerStarted","Data":"66af15cdce971ee9fec17909b74dbdfbc1f0d1c6a156dbe9db8c09cd3961ba26"} Mar 12 12:45:15.463268 master-0 kubenswrapper[13984]: I0312 12:45:15.463170 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" podStartSLOduration=8.463150673 podStartE2EDuration="8.463150673s" podCreationTimestamp="2026-03-12 12:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:15.462962068 +0000 UTC m=+1247.660977580" watchObservedRunningTime="2026-03-12 12:45:15.463150673 +0000 UTC m=+1247.661166165" Mar 12 12:45:15.510673 master-0 kubenswrapper[13984]: I0312 12:45:15.510587 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76f498f559-b67fm" podStartSLOduration=8.510567503 podStartE2EDuration="8.510567503s" podCreationTimestamp="2026-03-12 12:45:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:15.498998232 +0000 UTC m=+1247.697013734" watchObservedRunningTime="2026-03-12 12:45:15.510567503 +0000 UTC m=+1247.708582995" Mar 12 12:45:16.475636 master-0 kubenswrapper[13984]: I0312 12:45:16.475568 13984 generic.go:334] "Generic (PLEG): container finished" podID="095ac834-8b0e-490f-b39f-c97135436fab" containerID="10923612a4b81395e95cc6408a5b08245c7a0837c668197e6432bb9f9ee4e010" exitCode=0 Mar 12 12:45:16.476612 master-0 kubenswrapper[13984]: I0312 12:45:16.476580 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vttwx" event={"ID":"095ac834-8b0e-490f-b39f-c97135436fab","Type":"ContainerDied","Data":"10923612a4b81395e95cc6408a5b08245c7a0837c668197e6432bb9f9ee4e010"} Mar 12 12:45:17.490699 master-0 kubenswrapper[13984]: I0312 12:45:17.490585 13984 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vttwx" event={"ID":"095ac834-8b0e-490f-b39f-c97135436fab","Type":"ContainerStarted","Data":"494e8e8d9dcc7b486e134c5262e5927d31062731aee79e077ae2d197eec73db1"} Mar 12 12:45:18.501519 master-0 kubenswrapper[13984]: I0312 12:45:18.501428 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-bjdfk" event={"ID":"95386842-d198-447d-8b98-f74826151e8b","Type":"ContainerStarted","Data":"9cd292d103ccc25cfeca342ed1a82a5cebc76fb791fbbb2c0b064b31a7d1e850"} Mar 12 12:45:18.503791 master-0 kubenswrapper[13984]: I0312 12:45:18.503756 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"b6e60ab1-d624-4df2-a3e8-a9044c72edfc","Type":"ContainerStarted","Data":"ac97c90a0f901fa72877f8f2bac6a26dbbf26726a5b1c1345d548beebdc67848"} Mar 12 12:45:18.506512 master-0 kubenswrapper[13984]: I0312 12:45:18.506416 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-vttwx" event={"ID":"095ac834-8b0e-490f-b39f-c97135436fab","Type":"ContainerStarted","Data":"1f3a4294c3238077e33c50af08c025b51841284d0ccb04d6dcb68db72e718ae7"} Mar 12 12:45:18.506800 master-0 kubenswrapper[13984]: I0312 12:45:18.506757 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vttwx" Mar 12 12:45:18.506800 master-0 kubenswrapper[13984]: I0312 12:45:18.506783 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-vttwx" Mar 12 12:45:18.508018 master-0 kubenswrapper[13984]: I0312 12:45:18.507971 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"cb8736d5-b5e7-4ab6-9755-d295836ae7a4","Type":"ContainerStarted","Data":"2f3ae518baefa74e29ef63f342e6cccc789e4a4676c00a6982b80adc2fe5ce8f"} Mar 12 12:45:18.523953 master-0 kubenswrapper[13984]: I0312 12:45:18.523898 13984 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 12 12:45:18.561253 master-0 kubenswrapper[13984]: I0312 12:45:18.561086 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 12 12:45:18.651042 master-0 kubenswrapper[13984]: I0312 12:45:18.650944 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-bjdfk" podStartSLOduration=3.192880368 podStartE2EDuration="12.650920648s" podCreationTimestamp="2026-03-12 12:45:06 +0000 UTC" firstStartedPulling="2026-03-12 12:45:08.358578597 +0000 UTC m=+1240.556594089" lastFinishedPulling="2026-03-12 12:45:17.816618877 +0000 UTC m=+1250.014634369" observedRunningTime="2026-03-12 12:45:18.580908409 +0000 UTC m=+1250.778923901" watchObservedRunningTime="2026-03-12 12:45:18.650920648 +0000 UTC m=+1250.848936150" Mar 12 12:45:18.655616 master-0 kubenswrapper[13984]: I0312 12:45:18.655522 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.449148292 podStartE2EDuration="26.655501565s" podCreationTimestamp="2026-03-12 12:44:52 +0000 UTC" firstStartedPulling="2026-03-12 12:45:03.851520082 +0000 UTC m=+1236.049535574" lastFinishedPulling="2026-03-12 12:45:18.057873345 +0000 UTC m=+1250.255888847" observedRunningTime="2026-03-12 12:45:18.649089875 +0000 UTC m=+1250.847105387" watchObservedRunningTime="2026-03-12 12:45:18.655501565 +0000 UTC m=+1250.853517067" Mar 12 12:45:18.694332 master-0 kubenswrapper[13984]: I0312 12:45:18.694218 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=8.85827881 podStartE2EDuration="22.694190951s" podCreationTimestamp="2026-03-12 12:44:56 +0000 UTC" firstStartedPulling="2026-03-12 12:45:04.253988954 +0000 UTC m=+1236.452004446" lastFinishedPulling="2026-03-12 12:45:18.089901095 +0000 
UTC m=+1250.287916587" observedRunningTime="2026-03-12 12:45:18.693267259 +0000 UTC m=+1250.891282781" watchObservedRunningTime="2026-03-12 12:45:18.694190951 +0000 UTC m=+1250.892206443" Mar 12 12:45:18.725518 master-0 kubenswrapper[13984]: I0312 12:45:18.722794 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-vttwx" podStartSLOduration=14.08245602 podStartE2EDuration="24.72277689s" podCreationTimestamp="2026-03-12 12:44:54 +0000 UTC" firstStartedPulling="2026-03-12 12:45:03.253941427 +0000 UTC m=+1235.451956919" lastFinishedPulling="2026-03-12 12:45:13.894262297 +0000 UTC m=+1246.092277789" observedRunningTime="2026-03-12 12:45:18.721993932 +0000 UTC m=+1250.920009424" watchObservedRunningTime="2026-03-12 12:45:18.72277689 +0000 UTC m=+1250.920792382" Mar 12 12:45:19.520440 master-0 kubenswrapper[13984]: I0312 12:45:19.520351 13984 generic.go:334] "Generic (PLEG): container finished" podID="ea7532b5-e173-4190-a81d-a1e0d3bcd824" containerID="66af15cdce971ee9fec17909b74dbdfbc1f0d1c6a156dbe9db8c09cd3961ba26" exitCode=0 Mar 12 12:45:19.521369 master-0 kubenswrapper[13984]: I0312 12:45:19.520458 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ea7532b5-e173-4190-a81d-a1e0d3bcd824","Type":"ContainerDied","Data":"66af15cdce971ee9fec17909b74dbdfbc1f0d1c6a156dbe9db8c09cd3961ba26"} Mar 12 12:45:19.524086 master-0 kubenswrapper[13984]: I0312 12:45:19.524045 13984 generic.go:334] "Generic (PLEG): container finished" podID="cb5b76fd-5724-4d55-94b6-3c071262be24" containerID="13a16dcf10d3a770987049785c46248de3f5848dff290d09d9282c321a077dcc" exitCode=0 Mar 12 12:45:19.524956 master-0 kubenswrapper[13984]: I0312 12:45:19.524894 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"cb5b76fd-5724-4d55-94b6-3c071262be24","Type":"ContainerDied","Data":"13a16dcf10d3a770987049785c46248de3f5848dff290d09d9282c321a077dcc"} Mar 12 12:45:19.526460 master-0 kubenswrapper[13984]: I0312 12:45:19.526410 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 12 12:45:19.608203 master-0 kubenswrapper[13984]: I0312 12:45:19.608153 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 12 12:45:20.020374 master-0 kubenswrapper[13984]: I0312 12:45:20.020318 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 12 12:45:20.057579 master-0 kubenswrapper[13984]: I0312 12:45:20.057538 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 12 12:45:20.564831 master-0 kubenswrapper[13984]: I0312 12:45:20.564780 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79d6ccc4b7-kchfg"] Mar 12 12:45:20.565391 master-0 kubenswrapper[13984]: I0312 12:45:20.565007 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" podUID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerName="dnsmasq-dns" containerID="cri-o://cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32" gracePeriod=10 Mar 12 12:45:20.567178 master-0 kubenswrapper[13984]: I0312 12:45:20.566570 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" Mar 12 12:45:20.572451 master-0 kubenswrapper[13984]: I0312 12:45:20.572411 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ea7532b5-e173-4190-a81d-a1e0d3bcd824","Type":"ContainerStarted","Data":"4a1beaa604995c409705d9a7d547acf44864ad58548cd7895723541a7b5f4f9f"} Mar 12 12:45:20.575515 master-0 
kubenswrapper[13984]: I0312 12:45:20.575459 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"cb5b76fd-5724-4d55-94b6-3c071262be24","Type":"ContainerStarted","Data":"37d6771c50f4c01300a1e38be2566e95aacda172080d8ec8322d16f00be96ce9"} Mar 12 12:45:20.576215 master-0 kubenswrapper[13984]: I0312 12:45:20.576189 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 12 12:45:20.595584 master-0 kubenswrapper[13984]: I0312 12:45:20.593991 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-hbdd6"] Mar 12 12:45:20.595584 master-0 kubenswrapper[13984]: E0312 12:45:20.594392 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerName="dnsmasq-dns" Mar 12 12:45:20.595584 master-0 kubenswrapper[13984]: I0312 12:45:20.594404 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerName="dnsmasq-dns" Mar 12 12:45:20.595584 master-0 kubenswrapper[13984]: E0312 12:45:20.594415 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerName="dnsmasq-dns" Mar 12 12:45:20.595584 master-0 kubenswrapper[13984]: I0312 12:45:20.594422 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerName="dnsmasq-dns" Mar 12 12:45:20.595584 master-0 kubenswrapper[13984]: E0312 12:45:20.594438 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerName="init" Mar 12 12:45:20.595584 master-0 kubenswrapper[13984]: I0312 12:45:20.594444 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerName="init" Mar 12 12:45:20.597324 master-0 kubenswrapper[13984]: E0312 12:45:20.596520 13984 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerName="init" Mar 12 12:45:20.597324 master-0 kubenswrapper[13984]: I0312 12:45:20.596584 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerName="init" Mar 12 12:45:20.597324 master-0 kubenswrapper[13984]: I0312 12:45:20.597007 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="75d275ee-71d3-4fef-944d-0a04b037a6fd" containerName="dnsmasq-dns" Mar 12 12:45:20.597324 master-0 kubenswrapper[13984]: I0312 12:45:20.597029 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c532467-afe9-4d74-a5e0-f8f6ac941467" containerName="dnsmasq-dns" Mar 12 12:45:20.609660 master-0 kubenswrapper[13984]: I0312 12:45:20.598062 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.628608 master-0 kubenswrapper[13984]: I0312 12:45:20.628235 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-hbdd6"] Mar 12 12:45:20.662376 master-0 kubenswrapper[13984]: I0312 12:45:20.662285 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=22.72510241 podStartE2EDuration="33.662269381s" podCreationTimestamp="2026-03-12 12:44:47 +0000 UTC" firstStartedPulling="2026-03-12 12:45:02.950563919 +0000 UTC m=+1235.148579421" lastFinishedPulling="2026-03-12 12:45:13.8877309 +0000 UTC m=+1246.085746392" observedRunningTime="2026-03-12 12:45:20.650439364 +0000 UTC m=+1252.848454856" watchObservedRunningTime="2026-03-12 12:45:20.662269381 +0000 UTC m=+1252.860284873" Mar 12 12:45:20.685048 master-0 kubenswrapper[13984]: I0312 12:45:20.684932 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 12 12:45:20.719657 master-0 kubenswrapper[13984]: I0312 12:45:20.718886 
13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=23.13594104 podStartE2EDuration="34.718863636s" podCreationTimestamp="2026-03-12 12:44:46 +0000 UTC" firstStartedPulling="2026-03-12 12:45:02.365071142 +0000 UTC m=+1234.563086624" lastFinishedPulling="2026-03-12 12:45:13.947993728 +0000 UTC m=+1246.146009220" observedRunningTime="2026-03-12 12:45:20.709283741 +0000 UTC m=+1252.907299253" watchObservedRunningTime="2026-03-12 12:45:20.718863636 +0000 UTC m=+1252.916879128" Mar 12 12:45:20.763746 master-0 kubenswrapper[13984]: I0312 12:45:20.763682 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmf7j\" (UniqueName: \"kubernetes.io/projected/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-kube-api-access-tmf7j\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.763979 master-0 kubenswrapper[13984]: I0312 12:45:20.763826 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.763979 master-0 kubenswrapper[13984]: I0312 12:45:20.763913 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.764047 master-0 kubenswrapper[13984]: I0312 12:45:20.763985 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-config\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.764086 master-0 kubenswrapper[13984]: I0312 12:45:20.764043 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.868631 master-0 kubenswrapper[13984]: I0312 12:45:20.868436 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmf7j\" (UniqueName: \"kubernetes.io/projected/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-kube-api-access-tmf7j\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.870008 master-0 kubenswrapper[13984]: I0312 12:45:20.869154 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.870008 master-0 kubenswrapper[13984]: I0312 12:45:20.869253 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.870008 master-0 kubenswrapper[13984]: I0312 12:45:20.869327 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-config\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.870008 master-0 kubenswrapper[13984]: I0312 12:45:20.869414 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.872071 master-0 kubenswrapper[13984]: I0312 12:45:20.871987 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.872986 master-0 kubenswrapper[13984]: I0312 12:45:20.872512 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-sb\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.872986 master-0 kubenswrapper[13984]: I0312 12:45:20.872929 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-config\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.874840 master-0 kubenswrapper[13984]: I0312 12:45:20.874695 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-dns-svc\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:20.899032 master-0 kubenswrapper[13984]: I0312 12:45:20.898902 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmf7j\" (UniqueName: \"kubernetes.io/projected/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-kube-api-access-tmf7j\") pod \"dnsmasq-dns-5bf8b865dc-hbdd6\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") " pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:21.017635 master-0 kubenswrapper[13984]: I0312 12:45:21.017557 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:21.025524 master-0 kubenswrapper[13984]: I0312 12:45:21.025240 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 12 12:45:21.035748 master-0 kubenswrapper[13984]: I0312 12:45:21.030209 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0"
Mar 12 12:45:21.036241 master-0 kubenswrapper[13984]: I0312 12:45:21.035978 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Mar 12 12:45:21.045444 master-0 kubenswrapper[13984]: I0312 12:45:21.044343 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Mar 12 12:45:21.071407 master-0 kubenswrapper[13984]: I0312 12:45:21.065889 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Mar 12 12:45:21.075668 master-0 kubenswrapper[13984]: I0312 12:45:21.075614 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.075879 master-0 kubenswrapper[13984]: I0312 12:45:21.075736 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8497\" (UniqueName: \"kubernetes.io/projected/8c279ec7-a696-4020-a396-bda3ce501266-kube-api-access-m8497\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.075879 master-0 kubenswrapper[13984]: I0312 12:45:21.075782 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c279ec7-a696-4020-a396-bda3ce501266-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.075879 master-0 kubenswrapper[13984]: I0312 12:45:21.075811 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.075879 master-0 kubenswrapper[13984]: I0312 12:45:21.075846 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c279ec7-a696-4020-a396-bda3ce501266-config\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.076226 master-0 kubenswrapper[13984]: I0312 12:45:21.075960 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c279ec7-a696-4020-a396-bda3ce501266-scripts\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.076226 master-0 kubenswrapper[13984]: I0312 12:45:21.076044 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.131240 master-0 kubenswrapper[13984]: I0312 12:45:21.131168 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 12 12:45:21.181637 master-0 kubenswrapper[13984]: I0312 12:45:21.181537 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.181974 master-0 kubenswrapper[13984]: I0312 12:45:21.181729 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8497\" (UniqueName: \"kubernetes.io/projected/8c279ec7-a696-4020-a396-bda3ce501266-kube-api-access-m8497\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.181974 master-0 kubenswrapper[13984]: I0312 12:45:21.181816 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c279ec7-a696-4020-a396-bda3ce501266-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.182360 master-0 kubenswrapper[13984]: I0312 12:45:21.182330 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.182428 master-0 kubenswrapper[13984]: I0312 12:45:21.182386 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c279ec7-a696-4020-a396-bda3ce501266-config\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.182658 master-0 kubenswrapper[13984]: I0312 12:45:21.182624 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c279ec7-a696-4020-a396-bda3ce501266-scripts\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.182786 master-0 kubenswrapper[13984]: I0312 12:45:21.182758 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.186091 master-0 kubenswrapper[13984]: I0312 12:45:21.185926 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c279ec7-a696-4020-a396-bda3ce501266-config\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.186663 master-0 kubenswrapper[13984]: I0312 12:45:21.186618 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c279ec7-a696-4020-a396-bda3ce501266-scripts\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.187038 master-0 kubenswrapper[13984]: I0312 12:45:21.187005 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.188511 master-0 kubenswrapper[13984]: I0312 12:45:21.187673 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/8c279ec7-a696-4020-a396-bda3ce501266-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.192384 master-0 kubenswrapper[13984]: I0312 12:45:21.192290 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.196825 master-0 kubenswrapper[13984]: I0312 12:45:21.193758 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c279ec7-a696-4020-a396-bda3ce501266-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.212070 master-0 kubenswrapper[13984]: I0312 12:45:21.211991 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8497\" (UniqueName: \"kubernetes.io/projected/8c279ec7-a696-4020-a396-bda3ce501266-kube-api-access-m8497\") pod \"ovn-northd-0\" (UID: \"8c279ec7-a696-4020-a396-bda3ce501266\") " pod="openstack/ovn-northd-0"
Mar 12 12:45:21.328692 master-0 kubenswrapper[13984]: I0312 12:45:21.328336 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:21.396610 master-0 kubenswrapper[13984]: I0312 12:45:21.384843 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 12 12:45:21.396610 master-0 kubenswrapper[13984]: I0312 12:45:21.389789 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-config\") pod \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") "
Mar 12 12:45:21.396610 master-0 kubenswrapper[13984]: I0312 12:45:21.389982 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-ovsdbserver-nb\") pod \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") "
Mar 12 12:45:21.396610 master-0 kubenswrapper[13984]: I0312 12:45:21.390075 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-dns-svc\") pod \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") "
Mar 12 12:45:21.396610 master-0 kubenswrapper[13984]: I0312 12:45:21.390270 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kr7bd\" (UniqueName: \"kubernetes.io/projected/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-kube-api-access-kr7bd\") pod \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\" (UID: \"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d\") "
Mar 12 12:45:21.402286 master-0 kubenswrapper[13984]: I0312 12:45:21.401086 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-kube-api-access-kr7bd" (OuterVolumeSpecName: "kube-api-access-kr7bd") pod "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" (UID: "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d"). InnerVolumeSpecName "kube-api-access-kr7bd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:45:21.453274 master-0 kubenswrapper[13984]: I0312 12:45:21.453221 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-config" (OuterVolumeSpecName: "config") pod "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" (UID: "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:21.461026 master-0 kubenswrapper[13984]: I0312 12:45:21.460982 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" (UID: "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:21.462045 master-0 kubenswrapper[13984]: I0312 12:45:21.461929 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" (UID: "5f6e7440-1cbe-4a21-a714-c21fe6f3b11d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:21.497323 master-0 kubenswrapper[13984]: I0312 12:45:21.493230 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:21.497323 master-0 kubenswrapper[13984]: I0312 12:45:21.493288 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:21.497323 master-0 kubenswrapper[13984]: I0312 12:45:21.493303 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kr7bd\" (UniqueName: \"kubernetes.io/projected/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-kube-api-access-kr7bd\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:21.497323 master-0 kubenswrapper[13984]: I0312 12:45:21.493315 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:21.602355 master-0 kubenswrapper[13984]: I0312 12:45:21.602250 13984 generic.go:334] "Generic (PLEG): container finished" podID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerID="cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32" exitCode=0
Mar 12 12:45:21.602770 master-0 kubenswrapper[13984]: I0312 12:45:21.602410 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg"
Mar 12 12:45:21.602770 master-0 kubenswrapper[13984]: I0312 12:45:21.602503 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" event={"ID":"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d","Type":"ContainerDied","Data":"cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32"}
Mar 12 12:45:21.602770 master-0 kubenswrapper[13984]: I0312 12:45:21.602537 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79d6ccc4b7-kchfg" event={"ID":"5f6e7440-1cbe-4a21-a714-c21fe6f3b11d","Type":"ContainerDied","Data":"485db5fb52663a379d50afce51ea53cdee8bd219fc86008daa2e777934d12d21"}
Mar 12 12:45:21.602770 master-0 kubenswrapper[13984]: I0312 12:45:21.602555 13984 scope.go:117] "RemoveContainer" containerID="cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32"
Mar 12 12:45:21.711498 master-0 kubenswrapper[13984]: I0312 12:45:21.706520 13984 scope.go:117] "RemoveContainer" containerID="4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4"
Mar 12 12:45:21.711498 master-0 kubenswrapper[13984]: I0312 12:45:21.710815 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79d6ccc4b7-kchfg"]
Mar 12 12:45:21.725375 master-0 kubenswrapper[13984]: W0312 12:45:21.725308 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9a07dca_acf8_4a07_82f7_e5ebe3ff8d4c.slice/crio-f59f6bada1979cce50108ca95b76ad0f1aa4160d4e8bbee0c6f6ef94ff79e50b WatchSource:0}: Error finding container f59f6bada1979cce50108ca95b76ad0f1aa4160d4e8bbee0c6f6ef94ff79e50b: Status 404 returned error can't find the container with id f59f6bada1979cce50108ca95b76ad0f1aa4160d4e8bbee0c6f6ef94ff79e50b
Mar 12 12:45:21.734775 master-0 kubenswrapper[13984]: I0312 12:45:21.734217 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79d6ccc4b7-kchfg"]
Mar 12 12:45:21.737839 master-0 kubenswrapper[13984]: I0312 12:45:21.737048 13984 scope.go:117] "RemoveContainer" containerID="cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32"
Mar 12 12:45:21.752696 master-0 kubenswrapper[13984]: E0312 12:45:21.752524 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32\": container with ID starting with cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32 not found: ID does not exist" containerID="cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32"
Mar 12 12:45:21.752696 master-0 kubenswrapper[13984]: I0312 12:45:21.752578 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32"} err="failed to get container status \"cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32\": rpc error: code = NotFound desc = could not find container \"cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32\": container with ID starting with cec58fb1bc3d9970f1700236adf4095fc0fd11ac06ecd4113f6402d379064f32 not found: ID does not exist"
Mar 12 12:45:21.752696 master-0 kubenswrapper[13984]: I0312 12:45:21.752604 13984 scope.go:117] "RemoveContainer" containerID="4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4"
Mar 12 12:45:21.757866 master-0 kubenswrapper[13984]: E0312 12:45:21.756596 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4\": container with ID starting with 4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4 not found: ID does not exist" containerID="4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4"
Mar 12 12:45:21.757866 master-0 kubenswrapper[13984]: I0312 12:45:21.756637 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4"} err="failed to get container status \"4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4\": rpc error: code = NotFound desc = could not find container \"4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4\": container with ID starting with 4b8720fcbdbfb76efc0fe85922ed61868283f2ca37edaeebc9133d24d5c7b7f4 not found: ID does not exist"
Mar 12 12:45:21.772902 master-0 kubenswrapper[13984]: I0312 12:45:21.772800 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-hbdd6"]
Mar 12 12:45:22.004887 master-0 kubenswrapper[13984]: I0312 12:45:22.004816 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" path="/var/lib/kubelet/pods/5f6e7440-1cbe-4a21-a714-c21fe6f3b11d/volumes"
Mar 12 12:45:22.061156 master-0 kubenswrapper[13984]: I0312 12:45:22.061102 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 12 12:45:22.620634 master-0 kubenswrapper[13984]: I0312 12:45:22.620555 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8c279ec7-a696-4020-a396-bda3ce501266","Type":"ContainerStarted","Data":"fcba00d030f9df7d71fb648bc7900b69a55ddf5637b71e9ee8c6a3e2180d4702"}
Mar 12 12:45:22.626468 master-0 kubenswrapper[13984]: I0312 12:45:22.626403 13984 generic.go:334] "Generic (PLEG): container finished" podID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerID="48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f" exitCode=0
Mar 12 12:45:22.626623 master-0 kubenswrapper[13984]: I0312 12:45:22.626504 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" event={"ID":"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c","Type":"ContainerDied","Data":"48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f"}
Mar 12 12:45:22.626623 master-0 kubenswrapper[13984]: I0312 12:45:22.626585 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" event={"ID":"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c","Type":"ContainerStarted","Data":"f59f6bada1979cce50108ca95b76ad0f1aa4160d4e8bbee0c6f6ef94ff79e50b"}
Mar 12 12:45:22.667930 master-0 kubenswrapper[13984]: I0312 12:45:22.667622 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 12 12:45:22.669711 master-0 kubenswrapper[13984]: I0312 12:45:22.668982 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 12 12:45:22.727858 master-0 kubenswrapper[13984]: I0312 12:45:22.727737 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 12 12:45:22.729682 master-0 kubenswrapper[13984]: E0312 12:45:22.728767 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerName="init"
Mar 12 12:45:22.729682 master-0 kubenswrapper[13984]: I0312 12:45:22.728794 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerName="init"
Mar 12 12:45:22.729682 master-0 kubenswrapper[13984]: E0312 12:45:22.728844 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerName="dnsmasq-dns"
Mar 12 12:45:22.729682 master-0 kubenswrapper[13984]: I0312 12:45:22.728853 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerName="dnsmasq-dns"
Mar 12 12:45:22.729682 master-0 kubenswrapper[13984]: I0312 12:45:22.729205 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f6e7440-1cbe-4a21-a714-c21fe6f3b11d" containerName="dnsmasq-dns"
Mar 12 12:45:22.740170 master-0 kubenswrapper[13984]: I0312 12:45:22.739653 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 12 12:45:22.743703 master-0 kubenswrapper[13984]: I0312 12:45:22.743668 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 12 12:45:22.744231 master-0 kubenswrapper[13984]: I0312 12:45:22.744212 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 12 12:45:22.744395 master-0 kubenswrapper[13984]: I0312 12:45:22.744359 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 12 12:45:22.759353 master-0 kubenswrapper[13984]: I0312 12:45:22.759287 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 12 12:45:22.918760 master-0 kubenswrapper[13984]: I0312 12:45:22.918700 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76f498f559-b67fm"
Mar 12 12:45:22.936709 master-0 kubenswrapper[13984]: I0312 12:45:22.936643 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e2764a-7fca-4b4a-ae89-131f181cdeb9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:22.936965 master-0 kubenswrapper[13984]: I0312 12:45:22.936730 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/92e2764a-7fca-4b4a-ae89-131f181cdeb9-cache\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:22.936965 master-0 kubenswrapper[13984]: I0312 12:45:22.936813 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2a67d8c6-2b49-4c2d-9a78-92253d84ac5b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^97434091-4a73-4323-ac95-7f9ca05dc337\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:22.936965 master-0 kubenswrapper[13984]: I0312 12:45:22.936892 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:22.937083 master-0 kubenswrapper[13984]: I0312 12:45:22.936977 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d9lc\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-kube-api-access-7d9lc\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:22.937083 master-0 kubenswrapper[13984]: I0312 12:45:22.937075 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/92e2764a-7fca-4b4a-ae89-131f181cdeb9-lock\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.040427 master-0 kubenswrapper[13984]: I0312 12:45:23.040055 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.040427 master-0 kubenswrapper[13984]: I0312 12:45:23.040173 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d9lc\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-kube-api-access-7d9lc\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.040427 master-0 kubenswrapper[13984]: I0312 12:45:23.040219 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/92e2764a-7fca-4b4a-ae89-131f181cdeb9-lock\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.040427 master-0 kubenswrapper[13984]: I0312 12:45:23.040292 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e2764a-7fca-4b4a-ae89-131f181cdeb9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.040427 master-0 kubenswrapper[13984]: I0312 12:45:23.040321 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/92e2764a-7fca-4b4a-ae89-131f181cdeb9-cache\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.040427 master-0 kubenswrapper[13984]: I0312 12:45:23.040350 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2a67d8c6-2b49-4c2d-9a78-92253d84ac5b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^97434091-4a73-4323-ac95-7f9ca05dc337\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.042096 master-0 kubenswrapper[13984]: E0312 12:45:23.041436 13984 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 12 12:45:23.042096 master-0 kubenswrapper[13984]: E0312 12:45:23.041452 13984 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 12 12:45:23.042096 master-0 kubenswrapper[13984]: E0312 12:45:23.041513 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift podName:92e2764a-7fca-4b4a-ae89-131f181cdeb9 nodeName:}" failed. No retries permitted until 2026-03-12 12:45:23.541478756 +0000 UTC m=+1255.739494248 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift") pod "swift-storage-0" (UID: "92e2764a-7fca-4b4a-ae89-131f181cdeb9") : configmap "swift-ring-files" not found
Mar 12 12:45:23.044041 master-0 kubenswrapper[13984]: I0312 12:45:23.043753 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/92e2764a-7fca-4b4a-ae89-131f181cdeb9-lock\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.044041 master-0 kubenswrapper[13984]: I0312 12:45:23.044002 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/92e2764a-7fca-4b4a-ae89-131f181cdeb9-cache\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.049157 master-0 kubenswrapper[13984]: I0312 12:45:23.049091 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 12 12:45:23.049157 master-0 kubenswrapper[13984]: I0312 12:45:23.049134 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2a67d8c6-2b49-4c2d-9a78-92253d84ac5b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^97434091-4a73-4323-ac95-7f9ca05dc337\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/e39a6a4d2dc0433f0454bfefc686870caface7af6899ccdc400ff5a1aba36f08/globalmount\"" pod="openstack/swift-storage-0"
Mar 12 12:45:23.054981 master-0 kubenswrapper[13984]: I0312 12:45:23.054942 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92e2764a-7fca-4b4a-ae89-131f181cdeb9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.073561 master-0 kubenswrapper[13984]: I0312 12:45:23.073462 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d9lc\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-kube-api-access-7d9lc\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.524160 master-0 kubenswrapper[13984]: I0312 12:45:23.522859 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-2mjxx"]
Mar 12 12:45:23.527508 master-0 kubenswrapper[13984]: I0312 12:45:23.525050 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.535508 master-0 kubenswrapper[13984]: I0312 12:45:23.530433 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 12 12:45:23.535508 master-0 kubenswrapper[13984]: I0312 12:45:23.530535 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 12 12:45:23.535508 master-0 kubenswrapper[13984]: I0312 12:45:23.530561 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 12 12:45:23.535508 master-0 kubenswrapper[13984]: I0312 12:45:23.532270 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2mjxx"]
Mar 12 12:45:23.561008 master-0 kubenswrapper[13984]: I0312 12:45:23.560251 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:23.561008 master-0 kubenswrapper[13984]: E0312 12:45:23.560536 13984 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 12 12:45:23.561008 master-0 kubenswrapper[13984]: E0312 12:45:23.560558 13984 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 12 12:45:23.561008 master-0 kubenswrapper[13984]: E0312 12:45:23.560620 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift podName:92e2764a-7fca-4b4a-ae89-131f181cdeb9 nodeName:}" failed. No retries permitted until 2026-03-12 12:45:24.560601919 +0000 UTC m=+1256.758617421 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift") pod "swift-storage-0" (UID: "92e2764a-7fca-4b4a-ae89-131f181cdeb9") : configmap "swift-ring-files" not found
Mar 12 12:45:23.637766 master-0 kubenswrapper[13984]: I0312 12:45:23.637642 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" event={"ID":"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c","Type":"ContainerStarted","Data":"a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92"}
Mar 12 12:45:23.638248 master-0 kubenswrapper[13984]: I0312 12:45:23.637818 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6"
Mar 12 12:45:23.640437 master-0 kubenswrapper[13984]: I0312 12:45:23.639925 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8c279ec7-a696-4020-a396-bda3ce501266","Type":"ContainerStarted","Data":"b79b1a43980d252ca11101851de093bedd7ae24b590fc700fdc5c4a13eecb99f"}
Mar 12 12:45:23.661745 master-0 kubenswrapper[13984]: I0312 12:45:23.661683 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9k8q\" (UniqueName: \"kubernetes.io/projected/5767f9cc-96f2-4309-a6da-e89247924459-kube-api-access-d9k8q\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.661943 master-0 kubenswrapper[13984]: I0312 12:45:23.661906 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-dispersionconf\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.662043 master-0 kubenswrapper[13984]: I0312 12:45:23.662001 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-combined-ca-bundle\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.662043 master-0 kubenswrapper[13984]: I0312 12:45:23.662037 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-scripts\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.662124 master-0 kubenswrapper[13984]: I0312 12:45:23.662106 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-swiftconf\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.662214 master-0 kubenswrapper[13984]: I0312 12:45:23.662181 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-ring-data-devices\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.662253 master-0 kubenswrapper[13984]: I0312 12:45:23.662217 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5767f9cc-96f2-4309-a6da-e89247924459-etc-swift\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.693694 master-0 kubenswrapper[13984]: I0312 12:45:23.693638 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 12 12:45:23.693694 master-0 kubenswrapper[13984]: I0312 12:45:23.693705 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 12 12:45:23.764009 master-0 kubenswrapper[13984]: I0312 12:45:23.763706 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-combined-ca-bundle\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.764009 master-0 kubenswrapper[13984]: I0312 12:45:23.763769 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-scripts\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.764009 master-0 kubenswrapper[13984]: I0312 12:45:23.763813 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-swiftconf\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.764009 master-0 kubenswrapper[13984]: I0312 12:45:23.763876 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-ring-data-devices\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.764009 master-0 kubenswrapper[13984]: I0312 12:45:23.763907 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5767f9cc-96f2-4309-a6da-e89247924459-etc-swift\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.764009 master-0 kubenswrapper[13984]: I0312 12:45:23.763981 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9k8q\" (UniqueName: \"kubernetes.io/projected/5767f9cc-96f2-4309-a6da-e89247924459-kube-api-access-d9k8q\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.764462 master-0 kubenswrapper[13984]: I0312 12:45:23.764077 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-dispersionconf\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.765185 master-0 kubenswrapper[13984]: I0312 12:45:23.765134 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5767f9cc-96f2-4309-a6da-e89247924459-etc-swift\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:23.766045 master-0 kubenswrapper[13984]: I0312 12:45:23.765991 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-ring-data-devices\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
pod="openstack/swift-ring-rebalance-2mjxx" Mar 12 12:45:23.767076 master-0 kubenswrapper[13984]: I0312 12:45:23.767022 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-dispersionconf\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx" Mar 12 12:45:23.767618 master-0 kubenswrapper[13984]: I0312 12:45:23.767552 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-scripts\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx" Mar 12 12:45:23.768115 master-0 kubenswrapper[13984]: I0312 12:45:23.768081 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-swiftconf\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx" Mar 12 12:45:23.768955 master-0 kubenswrapper[13984]: I0312 12:45:23.768812 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-combined-ca-bundle\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx" Mar 12 12:45:23.834083 master-0 kubenswrapper[13984]: I0312 12:45:23.834036 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9k8q\" (UniqueName: \"kubernetes.io/projected/5767f9cc-96f2-4309-a6da-e89247924459-kube-api-access-d9k8q\") pod \"swift-ring-rebalance-2mjxx\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") " pod="openstack/swift-ring-rebalance-2mjxx" Mar 12 
12:45:23.848601 master-0 kubenswrapper[13984]: I0312 12:45:23.848547 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2mjxx" Mar 12 12:45:24.284930 master-0 kubenswrapper[13984]: I0312 12:45:24.284840 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" podStartSLOduration=4.284817213 podStartE2EDuration="4.284817213s" podCreationTimestamp="2026-03-12 12:45:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:24.27701528 +0000 UTC m=+1256.475030792" watchObservedRunningTime="2026-03-12 12:45:24.284817213 +0000 UTC m=+1256.482832705" Mar 12 12:45:24.592562 master-0 kubenswrapper[13984]: I0312 12:45:24.585670 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0" Mar 12 12:45:24.592562 master-0 kubenswrapper[13984]: E0312 12:45:24.585906 13984 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 12:45:24.592562 master-0 kubenswrapper[13984]: E0312 12:45:24.585923 13984 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 12:45:24.592562 master-0 kubenswrapper[13984]: E0312 12:45:24.585973 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift podName:92e2764a-7fca-4b4a-ae89-131f181cdeb9 nodeName:}" failed. No retries permitted until 2026-03-12 12:45:26.585956212 +0000 UTC m=+1258.783971714 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift") pod "swift-storage-0" (UID: "92e2764a-7fca-4b4a-ae89-131f181cdeb9") : configmap "swift-ring-files" not found Mar 12 12:45:24.652727 master-0 kubenswrapper[13984]: I0312 12:45:24.652666 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"8c279ec7-a696-4020-a396-bda3ce501266","Type":"ContainerStarted","Data":"76f061d16cf4103385438e8fc230703ab17c1df5139d4eef0e3615641a7db88c"} Mar 12 12:45:24.693439 master-0 kubenswrapper[13984]: I0312 12:45:24.693386 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 12 12:45:24.760766 master-0 kubenswrapper[13984]: I0312 12:45:24.760715 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 12 12:45:24.785026 master-0 kubenswrapper[13984]: I0312 12:45:24.784944 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-2mjxx"] Mar 12 12:45:24.797681 master-0 kubenswrapper[13984]: W0312 12:45:24.793447 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5767f9cc_96f2_4309_a6da_e89247924459.slice/crio-26c3fb47654e05adf7ddf25bc12489f3503b85020234f85cda8f5be4961dd9c4 WatchSource:0}: Error finding container 26c3fb47654e05adf7ddf25bc12489f3503b85020234f85cda8f5be4961dd9c4: Status 404 returned error can't find the container with id 26c3fb47654e05adf7ddf25bc12489f3503b85020234f85cda8f5be4961dd9c4 Mar 12 12:45:24.832223 master-0 kubenswrapper[13984]: I0312 12:45:24.831976 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=3.558876319 podStartE2EDuration="4.831956301s" podCreationTimestamp="2026-03-12 12:45:20 +0000 UTC" 
firstStartedPulling="2026-03-12 12:45:22.089087352 +0000 UTC m=+1254.287102844" lastFinishedPulling="2026-03-12 12:45:23.362167334 +0000 UTC m=+1255.560182826" observedRunningTime="2026-03-12 12:45:24.826174535 +0000 UTC m=+1257.024190027" watchObservedRunningTime="2026-03-12 12:45:24.831956301 +0000 UTC m=+1257.029971793" Mar 12 12:45:25.478970 master-0 kubenswrapper[13984]: I0312 12:45:25.478371 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2a67d8c6-2b49-4c2d-9a78-92253d84ac5b\" (UniqueName: \"kubernetes.io/csi/topolvm.io^97434091-4a73-4323-ac95-7f9ca05dc337\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0" Mar 12 12:45:25.677573 master-0 kubenswrapper[13984]: I0312 12:45:25.676987 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2mjxx" event={"ID":"5767f9cc-96f2-4309-a6da-e89247924459","Type":"ContainerStarted","Data":"26c3fb47654e05adf7ddf25bc12489f3503b85020234f85cda8f5be4961dd9c4"} Mar 12 12:45:25.683501 master-0 kubenswrapper[13984]: I0312 12:45:25.678142 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 12 12:45:26.627271 master-0 kubenswrapper[13984]: I0312 12:45:26.627222 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0" Mar 12 12:45:26.628175 master-0 kubenswrapper[13984]: E0312 12:45:26.627725 13984 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 12:45:26.628254 master-0 kubenswrapper[13984]: E0312 12:45:26.628241 13984 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 
12 12:45:26.628358 master-0 kubenswrapper[13984]: E0312 12:45:26.628347 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift podName:92e2764a-7fca-4b4a-ae89-131f181cdeb9 nodeName:}" failed. No retries permitted until 2026-03-12 12:45:30.628330222 +0000 UTC m=+1262.826345714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift") pod "swift-storage-0" (UID: "92e2764a-7fca-4b4a-ae89-131f181cdeb9") : configmap "swift-ring-files" not found Mar 12 12:45:26.872638 master-0 kubenswrapper[13984]: I0312 12:45:26.872592 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 12 12:45:26.956826 master-0 kubenswrapper[13984]: I0312 12:45:26.956779 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 12 12:45:27.370568 master-0 kubenswrapper[13984]: I0312 12:45:27.370443 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-wgcnp"] Mar 12 12:45:27.371884 master-0 kubenswrapper[13984]: I0312 12:45:27.371854 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:27.373751 master-0 kubenswrapper[13984]: I0312 12:45:27.373705 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 12 12:45:27.381343 master-0 kubenswrapper[13984]: I0312 12:45:27.381219 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wgcnp"] Mar 12 12:45:27.546856 master-0 kubenswrapper[13984]: I0312 12:45:27.546803 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lmgh\" (UniqueName: \"kubernetes.io/projected/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-kube-api-access-8lmgh\") pod \"root-account-create-update-wgcnp\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:27.547108 master-0 kubenswrapper[13984]: I0312 12:45:27.546873 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-operator-scripts\") pod \"root-account-create-update-wgcnp\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:27.649609 master-0 kubenswrapper[13984]: I0312 12:45:27.649492 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lmgh\" (UniqueName: \"kubernetes.io/projected/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-kube-api-access-8lmgh\") pod \"root-account-create-update-wgcnp\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:27.649609 master-0 kubenswrapper[13984]: I0312 12:45:27.649561 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-operator-scripts\") pod \"root-account-create-update-wgcnp\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:27.653592 master-0 kubenswrapper[13984]: I0312 12:45:27.653557 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-operator-scripts\") pod \"root-account-create-update-wgcnp\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:27.667619 master-0 kubenswrapper[13984]: I0312 12:45:27.667577 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lmgh\" (UniqueName: \"kubernetes.io/projected/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-kube-api-access-8lmgh\") pod \"root-account-create-update-wgcnp\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:27.691895 master-0 kubenswrapper[13984]: I0312 12:45:27.691837 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:29.070075 master-0 kubenswrapper[13984]: I0312 12:45:29.069941 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-wgcnp"] Mar 12 12:45:29.092055 master-0 kubenswrapper[13984]: I0312 12:45:29.091019 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 12 12:45:29.096584 master-0 kubenswrapper[13984]: I0312 12:45:29.096546 13984 trace.go:236] Trace[1289721739]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (12-Mar-2026 12:45:27.961) (total time: 1134ms): Mar 12 12:45:29.096584 master-0 kubenswrapper[13984]: Trace[1289721739]: [1.134887547s] [1.134887547s] END Mar 12 12:45:29.339450 master-0 kubenswrapper[13984]: I0312 12:45:29.339379 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fhfwn"] Mar 12 12:45:29.341095 master-0 kubenswrapper[13984]: I0312 12:45:29.341059 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.349903 master-0 kubenswrapper[13984]: I0312 12:45:29.349821 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fhfwn"] Mar 12 12:45:29.413359 master-0 kubenswrapper[13984]: I0312 12:45:29.413278 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwskc\" (UniqueName: \"kubernetes.io/projected/2635bca8-9408-45c1-b88c-3634684d244a-kube-api-access-lwskc\") pod \"keystone-db-create-fhfwn\" (UID: \"2635bca8-9408-45c1-b88c-3634684d244a\") " pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.413600 master-0 kubenswrapper[13984]: I0312 12:45:29.413400 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2635bca8-9408-45c1-b88c-3634684d244a-operator-scripts\") pod \"keystone-db-create-fhfwn\" (UID: \"2635bca8-9408-45c1-b88c-3634684d244a\") " pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.461425 master-0 kubenswrapper[13984]: I0312 12:45:29.461016 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-d62b-account-create-update-lr46l"] Mar 12 12:45:29.462453 master-0 kubenswrapper[13984]: I0312 12:45:29.462415 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.465133 master-0 kubenswrapper[13984]: I0312 12:45:29.465099 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 12 12:45:29.498151 master-0 kubenswrapper[13984]: I0312 12:45:29.498064 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d62b-account-create-update-lr46l"] Mar 12 12:45:29.516578 master-0 kubenswrapper[13984]: I0312 12:45:29.515170 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwskc\" (UniqueName: \"kubernetes.io/projected/2635bca8-9408-45c1-b88c-3634684d244a-kube-api-access-lwskc\") pod \"keystone-db-create-fhfwn\" (UID: \"2635bca8-9408-45c1-b88c-3634684d244a\") " pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.516578 master-0 kubenswrapper[13984]: I0312 12:45:29.515307 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2635bca8-9408-45c1-b88c-3634684d244a-operator-scripts\") pod \"keystone-db-create-fhfwn\" (UID: \"2635bca8-9408-45c1-b88c-3634684d244a\") " pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.516835 master-0 kubenswrapper[13984]: I0312 12:45:29.516780 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2635bca8-9408-45c1-b88c-3634684d244a-operator-scripts\") pod \"keystone-db-create-fhfwn\" (UID: \"2635bca8-9408-45c1-b88c-3634684d244a\") " pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.533430 master-0 kubenswrapper[13984]: I0312 12:45:29.532832 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwskc\" (UniqueName: \"kubernetes.io/projected/2635bca8-9408-45c1-b88c-3634684d244a-kube-api-access-lwskc\") pod \"keystone-db-create-fhfwn\" (UID: 
\"2635bca8-9408-45c1-b88c-3634684d244a\") " pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.570951 master-0 kubenswrapper[13984]: I0312 12:45:29.570874 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7mhht"] Mar 12 12:45:29.574448 master-0 kubenswrapper[13984]: I0312 12:45:29.572259 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.595947 master-0 kubenswrapper[13984]: I0312 12:45:29.595881 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7mhht"] Mar 12 12:45:29.617369 master-0 kubenswrapper[13984]: I0312 12:45:29.617274 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f71885-a3f0-4897-bc6c-bfd657a35108-operator-scripts\") pod \"keystone-d62b-account-create-update-lr46l\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.617771 master-0 kubenswrapper[13984]: I0312 12:45:29.617511 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgk9n\" (UniqueName: \"kubernetes.io/projected/b6f71885-a3f0-4897-bc6c-bfd657a35108-kube-api-access-kgk9n\") pod \"keystone-d62b-account-create-update-lr46l\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.661228 master-0 kubenswrapper[13984]: I0312 12:45:29.659157 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:29.719306 master-0 kubenswrapper[13984]: I0312 12:45:29.719178 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-1719-account-create-update-6wbfk"] Mar 12 12:45:29.721021 master-0 kubenswrapper[13984]: I0312 12:45:29.720971 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:29.721559 master-0 kubenswrapper[13984]: I0312 12:45:29.721465 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f71885-a3f0-4897-bc6c-bfd657a35108-operator-scripts\") pod \"keystone-d62b-account-create-update-lr46l\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.721631 master-0 kubenswrapper[13984]: I0312 12:45:29.721606 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-operator-scripts\") pod \"placement-db-create-7mhht\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.721675 master-0 kubenswrapper[13984]: I0312 12:45:29.721663 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgk9n\" (UniqueName: \"kubernetes.io/projected/b6f71885-a3f0-4897-bc6c-bfd657a35108-kube-api-access-kgk9n\") pod \"keystone-d62b-account-create-update-lr46l\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.721727 master-0 kubenswrapper[13984]: I0312 12:45:29.721709 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxl28\" (UniqueName: 
\"kubernetes.io/projected/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-kube-api-access-wxl28\") pod \"placement-db-create-7mhht\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.723558 master-0 kubenswrapper[13984]: I0312 12:45:29.723517 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f71885-a3f0-4897-bc6c-bfd657a35108-operator-scripts\") pod \"keystone-d62b-account-create-update-lr46l\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.730735 master-0 kubenswrapper[13984]: I0312 12:45:29.728173 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 12 12:45:29.742150 master-0 kubenswrapper[13984]: I0312 12:45:29.741079 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1719-account-create-update-6wbfk"] Mar 12 12:45:29.758591 master-0 kubenswrapper[13984]: I0312 12:45:29.758520 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2mjxx" event={"ID":"5767f9cc-96f2-4309-a6da-e89247924459","Type":"ContainerStarted","Data":"b9354b78aa5b677786c949a06dfde921f1036d738cc9b853302ab77cc909dde4"} Mar 12 12:45:29.761674 master-0 kubenswrapper[13984]: I0312 12:45:29.761558 13984 generic.go:334] "Generic (PLEG): container finished" podID="a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f" containerID="97baea7771b388f78ed90c0c95190d2450394fc3f19b93d18c4699807735a2fe" exitCode=0 Mar 12 12:45:29.761674 master-0 kubenswrapper[13984]: I0312 12:45:29.761613 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wgcnp" event={"ID":"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f","Type":"ContainerDied","Data":"97baea7771b388f78ed90c0c95190d2450394fc3f19b93d18c4699807735a2fe"} Mar 12 12:45:29.761674 master-0 kubenswrapper[13984]: I0312 
12:45:29.761638 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-wgcnp" event={"ID":"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f","Type":"ContainerStarted","Data":"58c28f554050dd2490a9775602397157929550d0a58383edc1b662eedd810dba"} Mar 12 12:45:29.770636 master-0 kubenswrapper[13984]: I0312 12:45:29.769046 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgk9n\" (UniqueName: \"kubernetes.io/projected/b6f71885-a3f0-4897-bc6c-bfd657a35108-kube-api-access-kgk9n\") pod \"keystone-d62b-account-create-update-lr46l\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.781325 master-0 kubenswrapper[13984]: I0312 12:45:29.781235 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-2mjxx" podStartSLOduration=2.648951405 podStartE2EDuration="6.781219069s" podCreationTimestamp="2026-03-12 12:45:23 +0000 UTC" firstStartedPulling="2026-03-12 12:45:24.796093701 +0000 UTC m=+1256.994109203" lastFinishedPulling="2026-03-12 12:45:28.928361385 +0000 UTC m=+1261.126376867" observedRunningTime="2026-03-12 12:45:29.779804205 +0000 UTC m=+1261.977819697" watchObservedRunningTime="2026-03-12 12:45:29.781219069 +0000 UTC m=+1261.979234561" Mar 12 12:45:29.789285 master-0 kubenswrapper[13984]: I0312 12:45:29.789205 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:29.823586 master-0 kubenswrapper[13984]: I0312 12:45:29.823529 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-operator-scripts\") pod \"placement-db-create-7mhht\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.824505 master-0 kubenswrapper[13984]: I0312 12:45:29.824452 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-operator-scripts\") pod \"placement-db-create-7mhht\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.837437 master-0 kubenswrapper[13984]: I0312 12:45:29.823816 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxl28\" (UniqueName: \"kubernetes.io/projected/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-kube-api-access-wxl28\") pod \"placement-db-create-7mhht\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.838215 master-0 kubenswrapper[13984]: I0312 12:45:29.838159 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz69q\" (UniqueName: \"kubernetes.io/projected/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-kube-api-access-dz69q\") pod \"placement-1719-account-create-update-6wbfk\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:29.838376 master-0 kubenswrapper[13984]: I0312 12:45:29.838345 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-operator-scripts\") pod \"placement-1719-account-create-update-6wbfk\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:29.840335 master-0 kubenswrapper[13984]: I0312 12:45:29.840289 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxl28\" (UniqueName: \"kubernetes.io/projected/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-kube-api-access-wxl28\") pod \"placement-db-create-7mhht\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.896718 master-0 kubenswrapper[13984]: I0312 12:45:29.896074 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7mhht" Mar 12 12:45:29.940119 master-0 kubenswrapper[13984]: I0312 12:45:29.940036 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz69q\" (UniqueName: \"kubernetes.io/projected/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-kube-api-access-dz69q\") pod \"placement-1719-account-create-update-6wbfk\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:29.940320 master-0 kubenswrapper[13984]: I0312 12:45:29.940169 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-operator-scripts\") pod \"placement-1719-account-create-update-6wbfk\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:29.941500 master-0 kubenswrapper[13984]: I0312 12:45:29.941424 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-operator-scripts\") pod 
\"placement-1719-account-create-update-6wbfk\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:29.963800 master-0 kubenswrapper[13984]: I0312 12:45:29.963760 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz69q\" (UniqueName: \"kubernetes.io/projected/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-kube-api-access-dz69q\") pod \"placement-1719-account-create-update-6wbfk\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:30.222437 master-0 kubenswrapper[13984]: I0312 12:45:30.222327 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fhfwn"] Mar 12 12:45:30.247569 master-0 kubenswrapper[13984]: I0312 12:45:30.243452 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:30.247569 master-0 kubenswrapper[13984]: W0312 12:45:30.244240 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2635bca8_9408_45c1_b88c_3634684d244a.slice/crio-083d4f2e22a580f9d9c4e8363c91dd9d8d46e56e171a5514962a07e6cc3e14d2 WatchSource:0}: Error finding container 083d4f2e22a580f9d9c4e8363c91dd9d8d46e56e171a5514962a07e6cc3e14d2: Status 404 returned error can't find the container with id 083d4f2e22a580f9d9c4e8363c91dd9d8d46e56e171a5514962a07e6cc3e14d2 Mar 12 12:45:30.371347 master-0 kubenswrapper[13984]: I0312 12:45:30.369565 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-d62b-account-create-update-lr46l"] Mar 12 12:45:30.394072 master-0 kubenswrapper[13984]: W0312 12:45:30.393062 13984 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6f71885_a3f0_4897_bc6c_bfd657a35108.slice/crio-c07c90c6fefa5f1f13392fa4a0c1b3bb46ed5e61f557b55fa1e1f9a6647c4508 WatchSource:0}: Error finding container c07c90c6fefa5f1f13392fa4a0c1b3bb46ed5e61f557b55fa1e1f9a6647c4508: Status 404 returned error can't find the container with id c07c90c6fefa5f1f13392fa4a0c1b3bb46ed5e61f557b55fa1e1f9a6647c4508 Mar 12 12:45:30.500051 master-0 kubenswrapper[13984]: W0312 12:45:30.500003 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13933f72_2f21_4c9d_8d80_c5fbd0e9f94f.slice/crio-91bdd45d27c8b2283e85f9a6a916287b62ef05e4e8b2710ecd81ba1bdac8cd8b WatchSource:0}: Error finding container 91bdd45d27c8b2283e85f9a6a916287b62ef05e4e8b2710ecd81ba1bdac8cd8b: Status 404 returned error can't find the container with id 91bdd45d27c8b2283e85f9a6a916287b62ef05e4e8b2710ecd81ba1bdac8cd8b Mar 12 12:45:30.509849 master-0 kubenswrapper[13984]: I0312 12:45:30.509802 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7mhht"] Mar 12 12:45:30.655855 master-0 kubenswrapper[13984]: I0312 12:45:30.655790 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0" Mar 12 12:45:30.656073 master-0 kubenswrapper[13984]: E0312 12:45:30.656031 13984 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 12 12:45:30.656115 master-0 kubenswrapper[13984]: E0312 12:45:30.656095 13984 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 12 12:45:30.656218 master-0 kubenswrapper[13984]: E0312 
12:45:30.656189 13984 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift podName:92e2764a-7fca-4b4a-ae89-131f181cdeb9 nodeName:}" failed. No retries permitted until 2026-03-12 12:45:38.65614276 +0000 UTC m=+1270.854158272 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift") pod "swift-storage-0" (UID: "92e2764a-7fca-4b4a-ae89-131f181cdeb9") : configmap "swift-ring-files" not found Mar 12 12:45:30.725848 master-0 kubenswrapper[13984]: I0312 12:45:30.725681 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-1719-account-create-update-6wbfk"] Mar 12 12:45:30.778053 master-0 kubenswrapper[13984]: I0312 12:45:30.777952 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1719-account-create-update-6wbfk" event={"ID":"f9a1f634-3f7f-4470-838b-e675c4fb5c4a","Type":"ContainerStarted","Data":"cec4f229c2d0ccdfe8568e9b7f34cacaa7b594259805d8b8ad70907f316afd1c"} Mar 12 12:45:30.780127 master-0 kubenswrapper[13984]: I0312 12:45:30.780076 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d62b-account-create-update-lr46l" event={"ID":"b6f71885-a3f0-4897-bc6c-bfd657a35108","Type":"ContainerStarted","Data":"8d9c41861da3aec18944cc09814f2f91d309a70d1713a8ff2580c3316e86a782"} Mar 12 12:45:30.780222 master-0 kubenswrapper[13984]: I0312 12:45:30.780166 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d62b-account-create-update-lr46l" event={"ID":"b6f71885-a3f0-4897-bc6c-bfd657a35108","Type":"ContainerStarted","Data":"c07c90c6fefa5f1f13392fa4a0c1b3bb46ed5e61f557b55fa1e1f9a6647c4508"} Mar 12 12:45:30.781435 master-0 kubenswrapper[13984]: I0312 12:45:30.781399 13984 generic.go:334] "Generic (PLEG): container finished" podID="2635bca8-9408-45c1-b88c-3634684d244a" 
containerID="3b581960fdd5f0e39c4ffb1800356cb0ecf71e020d51c93b6bd30f4e2fa35ef0" exitCode=0 Mar 12 12:45:30.781563 master-0 kubenswrapper[13984]: I0312 12:45:30.781458 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fhfwn" event={"ID":"2635bca8-9408-45c1-b88c-3634684d244a","Type":"ContainerDied","Data":"3b581960fdd5f0e39c4ffb1800356cb0ecf71e020d51c93b6bd30f4e2fa35ef0"} Mar 12 12:45:30.781563 master-0 kubenswrapper[13984]: I0312 12:45:30.781501 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fhfwn" event={"ID":"2635bca8-9408-45c1-b88c-3634684d244a","Type":"ContainerStarted","Data":"083d4f2e22a580f9d9c4e8363c91dd9d8d46e56e171a5514962a07e6cc3e14d2"} Mar 12 12:45:30.788057 master-0 kubenswrapper[13984]: I0312 12:45:30.787992 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7mhht" event={"ID":"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f","Type":"ContainerStarted","Data":"349d95faffaa7d9e72b53de5864761f081d3ecf7f226b3d5032669e810f5a102"} Mar 12 12:45:30.788224 master-0 kubenswrapper[13984]: I0312 12:45:30.788128 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7mhht" event={"ID":"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f","Type":"ContainerStarted","Data":"91bdd45d27c8b2283e85f9a6a916287b62ef05e4e8b2710ecd81ba1bdac8cd8b"} Mar 12 12:45:30.858843 master-0 kubenswrapper[13984]: I0312 12:45:30.858763 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-d62b-account-create-update-lr46l" podStartSLOduration=1.858744503 podStartE2EDuration="1.858744503s" podCreationTimestamp="2026-03-12 12:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:30.854820071 +0000 UTC m=+1263.052835563" watchObservedRunningTime="2026-03-12 12:45:30.858744503 +0000 UTC m=+1263.056759985" Mar 12 
12:45:30.888635 master-0 kubenswrapper[13984]: I0312 12:45:30.887780 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-7mhht" podStartSLOduration=1.887761472 podStartE2EDuration="1.887761472s" podCreationTimestamp="2026-03-12 12:45:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:30.887430274 +0000 UTC m=+1263.085445766" watchObservedRunningTime="2026-03-12 12:45:30.887761472 +0000 UTC m=+1263.085776964" Mar 12 12:45:31.019835 master-0 kubenswrapper[13984]: I0312 12:45:31.019769 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" Mar 12 12:45:31.109974 master-0 kubenswrapper[13984]: I0312 12:45:31.109321 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-b67fm"] Mar 12 12:45:31.109974 master-0 kubenswrapper[13984]: I0312 12:45:31.109629 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76f498f559-b67fm" podUID="7f87409d-d6ed-4313-8daa-e92892584f21" containerName="dnsmasq-dns" containerID="cri-o://39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb" gracePeriod=10 Mar 12 12:45:31.244870 master-0 kubenswrapper[13984]: I0312 12:45:31.244834 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:31.270567 master-0 kubenswrapper[13984]: I0312 12:45:31.270136 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lmgh\" (UniqueName: \"kubernetes.io/projected/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-kube-api-access-8lmgh\") pod \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " Mar 12 12:45:31.270567 master-0 kubenswrapper[13984]: I0312 12:45:31.270205 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-operator-scripts\") pod \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\" (UID: \"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f\") " Mar 12 12:45:31.271524 master-0 kubenswrapper[13984]: I0312 12:45:31.271216 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f" (UID: "a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:31.274655 master-0 kubenswrapper[13984]: I0312 12:45:31.274623 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-kube-api-access-8lmgh" (OuterVolumeSpecName: "kube-api-access-8lmgh") pod "a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f" (UID: "a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f"). InnerVolumeSpecName "kube-api-access-8lmgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:31.377738 master-0 kubenswrapper[13984]: I0312 12:45:31.377683 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lmgh\" (UniqueName: \"kubernetes.io/projected/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-kube-api-access-8lmgh\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:31.377738 master-0 kubenswrapper[13984]: I0312 12:45:31.377729 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:31.790060 master-0 kubenswrapper[13984]: I0312 12:45:31.784680 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-b67fm" Mar 12 12:45:31.790060 master-0 kubenswrapper[13984]: I0312 12:45:31.789289 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-config\") pod \"7f87409d-d6ed-4313-8daa-e92892584f21\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " Mar 12 12:45:31.790060 master-0 kubenswrapper[13984]: I0312 12:45:31.789391 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt7fr\" (UniqueName: \"kubernetes.io/projected/7f87409d-d6ed-4313-8daa-e92892584f21-kube-api-access-gt7fr\") pod \"7f87409d-d6ed-4313-8daa-e92892584f21\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " Mar 12 12:45:31.790060 master-0 kubenswrapper[13984]: I0312 12:45:31.789418 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-sb\") pod \"7f87409d-d6ed-4313-8daa-e92892584f21\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " Mar 12 12:45:31.790060 master-0 
kubenswrapper[13984]: I0312 12:45:31.789448 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-nb\") pod \"7f87409d-d6ed-4313-8daa-e92892584f21\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " Mar 12 12:45:31.790060 master-0 kubenswrapper[13984]: I0312 12:45:31.789528 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-dns-svc\") pod \"7f87409d-d6ed-4313-8daa-e92892584f21\" (UID: \"7f87409d-d6ed-4313-8daa-e92892584f21\") " Mar 12 12:45:31.795177 master-0 kubenswrapper[13984]: I0312 12:45:31.795131 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f87409d-d6ed-4313-8daa-e92892584f21-kube-api-access-gt7fr" (OuterVolumeSpecName: "kube-api-access-gt7fr") pod "7f87409d-d6ed-4313-8daa-e92892584f21" (UID: "7f87409d-d6ed-4313-8daa-e92892584f21"). InnerVolumeSpecName "kube-api-access-gt7fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:31.803931 master-0 kubenswrapper[13984]: I0312 12:45:31.803880 13984 generic.go:334] "Generic (PLEG): container finished" podID="13933f72-2f21-4c9d-8d80-c5fbd0e9f94f" containerID="349d95faffaa7d9e72b53de5864761f081d3ecf7f226b3d5032669e810f5a102" exitCode=0 Mar 12 12:45:31.804119 master-0 kubenswrapper[13984]: I0312 12:45:31.803986 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7mhht" event={"ID":"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f","Type":"ContainerDied","Data":"349d95faffaa7d9e72b53de5864761f081d3ecf7f226b3d5032669e810f5a102"} Mar 12 12:45:31.807857 master-0 kubenswrapper[13984]: I0312 12:45:31.807023 13984 generic.go:334] "Generic (PLEG): container finished" podID="f9a1f634-3f7f-4470-838b-e675c4fb5c4a" containerID="d98ea8c5277a00743e9c051fef5f2e77249fcbd9eeaf709efe4a33f5e9324a86" exitCode=0 Mar 12 12:45:31.807857 master-0 kubenswrapper[13984]: I0312 12:45:31.807092 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1719-account-create-update-6wbfk" event={"ID":"f9a1f634-3f7f-4470-838b-e675c4fb5c4a","Type":"ContainerDied","Data":"d98ea8c5277a00743e9c051fef5f2e77249fcbd9eeaf709efe4a33f5e9324a86"} Mar 12 12:45:31.832137 master-0 kubenswrapper[13984]: I0312 12:45:31.825123 13984 generic.go:334] "Generic (PLEG): container finished" podID="b6f71885-a3f0-4897-bc6c-bfd657a35108" containerID="8d9c41861da3aec18944cc09814f2f91d309a70d1713a8ff2580c3316e86a782" exitCode=0 Mar 12 12:45:31.832137 master-0 kubenswrapper[13984]: I0312 12:45:31.825301 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d62b-account-create-update-lr46l" event={"ID":"b6f71885-a3f0-4897-bc6c-bfd657a35108","Type":"ContainerDied","Data":"8d9c41861da3aec18944cc09814f2f91d309a70d1713a8ff2580c3316e86a782"} Mar 12 12:45:31.864815 master-0 kubenswrapper[13984]: I0312 12:45:31.844085 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-wgcnp" event={"ID":"a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f","Type":"ContainerDied","Data":"58c28f554050dd2490a9775602397157929550d0a58383edc1b662eedd810dba"} Mar 12 12:45:31.864815 master-0 kubenswrapper[13984]: I0312 12:45:31.844174 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58c28f554050dd2490a9775602397157929550d0a58383edc1b662eedd810dba" Mar 12 12:45:31.864815 master-0 kubenswrapper[13984]: I0312 12:45:31.844257 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-wgcnp" Mar 12 12:45:31.874572 master-0 kubenswrapper[13984]: I0312 12:45:31.873986 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-config" (OuterVolumeSpecName: "config") pod "7f87409d-d6ed-4313-8daa-e92892584f21" (UID: "7f87409d-d6ed-4313-8daa-e92892584f21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:31.878096 master-0 kubenswrapper[13984]: I0312 12:45:31.878040 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f87409d-d6ed-4313-8daa-e92892584f21" (UID: "7f87409d-d6ed-4313-8daa-e92892584f21"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.889499 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f87409d-d6ed-4313-8daa-e92892584f21" (UID: "7f87409d-d6ed-4313-8daa-e92892584f21"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.890796 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.890831 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt7fr\" (UniqueName: \"kubernetes.io/projected/7f87409d-d6ed-4313-8daa-e92892584f21-kube-api-access-gt7fr\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.890843 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.890852 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.895921 13984 generic.go:334] "Generic (PLEG): container finished" podID="7f87409d-d6ed-4313-8daa-e92892584f21" containerID="39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb" exitCode=0 Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.896095 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76f498f559-b67fm" Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.897335 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-b67fm" event={"ID":"7f87409d-d6ed-4313-8daa-e92892584f21","Type":"ContainerDied","Data":"39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb"} Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.897364 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76f498f559-b67fm" event={"ID":"7f87409d-d6ed-4313-8daa-e92892584f21","Type":"ContainerDied","Data":"b9eea651e3dfeb30fa9261fdb5966a86ee9b84562fdc08640a0f96b03129ac42"} Mar 12 12:45:31.897508 master-0 kubenswrapper[13984]: I0312 12:45:31.897386 13984 scope.go:117] "RemoveContainer" containerID="39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb" Mar 12 12:45:31.909521 master-0 kubenswrapper[13984]: I0312 12:45:31.909455 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f87409d-d6ed-4313-8daa-e92892584f21" (UID: "7f87409d-d6ed-4313-8daa-e92892584f21"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:31.919789 master-0 kubenswrapper[13984]: I0312 12:45:31.919752 13984 scope.go:117] "RemoveContainer" containerID="8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215" Mar 12 12:45:31.944678 master-0 kubenswrapper[13984]: I0312 12:45:31.944640 13984 scope.go:117] "RemoveContainer" containerID="39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb" Mar 12 12:45:31.945012 master-0 kubenswrapper[13984]: E0312 12:45:31.944976 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb\": container with ID starting with 39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb not found: ID does not exist" containerID="39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb" Mar 12 12:45:31.945074 master-0 kubenswrapper[13984]: I0312 12:45:31.945006 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb"} err="failed to get container status \"39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb\": rpc error: code = NotFound desc = could not find container \"39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb\": container with ID starting with 39197adf323de98ca5235021c8e2e74d57b6fed56027c962c3d1722ddf254cdb not found: ID does not exist" Mar 12 12:45:31.945074 master-0 kubenswrapper[13984]: I0312 12:45:31.945027 13984 scope.go:117] "RemoveContainer" containerID="8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215" Mar 12 12:45:31.945217 master-0 kubenswrapper[13984]: E0312 12:45:31.945194 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215\": 
container with ID starting with 8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215 not found: ID does not exist" containerID="8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215" Mar 12 12:45:31.945273 master-0 kubenswrapper[13984]: I0312 12:45:31.945214 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215"} err="failed to get container status \"8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215\": rpc error: code = NotFound desc = could not find container \"8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215\": container with ID starting with 8115710294e5b4218026e00ae4f9d5c0e39599e845da87a5f04a4689ff26c215 not found: ID does not exist" Mar 12 12:45:31.994177 master-0 kubenswrapper[13984]: I0312 12:45:31.993874 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f87409d-d6ed-4313-8daa-e92892584f21-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:32.228724 master-0 kubenswrapper[13984]: I0312 12:45:32.228669 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-b67fm"] Mar 12 12:45:32.237230 master-0 kubenswrapper[13984]: I0312 12:45:32.237181 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76f498f559-b67fm"] Mar 12 12:45:32.328451 master-0 kubenswrapper[13984]: I0312 12:45:32.328418 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:32.400513 master-0 kubenswrapper[13984]: I0312 12:45:32.400456 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2635bca8-9408-45c1-b88c-3634684d244a-operator-scripts\") pod \"2635bca8-9408-45c1-b88c-3634684d244a\" (UID: \"2635bca8-9408-45c1-b88c-3634684d244a\") " Mar 12 12:45:32.401029 master-0 kubenswrapper[13984]: I0312 12:45:32.400982 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2635bca8-9408-45c1-b88c-3634684d244a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2635bca8-9408-45c1-b88c-3634684d244a" (UID: "2635bca8-9408-45c1-b88c-3634684d244a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:32.401170 master-0 kubenswrapper[13984]: I0312 12:45:32.401145 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwskc\" (UniqueName: \"kubernetes.io/projected/2635bca8-9408-45c1-b88c-3634684d244a-kube-api-access-lwskc\") pod \"2635bca8-9408-45c1-b88c-3634684d244a\" (UID: \"2635bca8-9408-45c1-b88c-3634684d244a\") " Mar 12 12:45:32.402315 master-0 kubenswrapper[13984]: I0312 12:45:32.402298 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2635bca8-9408-45c1-b88c-3634684d244a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:32.412824 master-0 kubenswrapper[13984]: I0312 12:45:32.412778 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2635bca8-9408-45c1-b88c-3634684d244a-kube-api-access-lwskc" (OuterVolumeSpecName: "kube-api-access-lwskc") pod "2635bca8-9408-45c1-b88c-3634684d244a" (UID: "2635bca8-9408-45c1-b88c-3634684d244a"). InnerVolumeSpecName "kube-api-access-lwskc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:32.503379 master-0 kubenswrapper[13984]: I0312 12:45:32.503247 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwskc\" (UniqueName: \"kubernetes.io/projected/2635bca8-9408-45c1-b88c-3634684d244a-kube-api-access-lwskc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:32.914230 master-0 kubenswrapper[13984]: I0312 12:45:32.914079 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fhfwn" event={"ID":"2635bca8-9408-45c1-b88c-3634684d244a","Type":"ContainerDied","Data":"083d4f2e22a580f9d9c4e8363c91dd9d8d46e56e171a5514962a07e6cc3e14d2"} Mar 12 12:45:32.914230 master-0 kubenswrapper[13984]: I0312 12:45:32.914145 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="083d4f2e22a580f9d9c4e8363c91dd9d8d46e56e171a5514962a07e6cc3e14d2" Mar 12 12:45:32.914471 master-0 kubenswrapper[13984]: I0312 12:45:32.914338 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fhfwn" Mar 12 12:45:33.477824 master-0 kubenswrapper[13984]: I0312 12:45:33.477037 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618090 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hncbq"] Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: E0312 12:45:33.618509 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f87409d-d6ed-4313-8daa-e92892584f21" containerName="init" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618522 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f87409d-d6ed-4313-8daa-e92892584f21" containerName="init" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: E0312 12:45:33.618540 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2635bca8-9408-45c1-b88c-3634684d244a" containerName="mariadb-database-create" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618546 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="2635bca8-9408-45c1-b88c-3634684d244a" containerName="mariadb-database-create" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: E0312 12:45:33.618564 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f87409d-d6ed-4313-8daa-e92892584f21" containerName="dnsmasq-dns" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618570 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f87409d-d6ed-4313-8daa-e92892584f21" containerName="dnsmasq-dns" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: E0312 12:45:33.618593 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a1f634-3f7f-4470-838b-e675c4fb5c4a" containerName="mariadb-account-create-update" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618599 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a1f634-3f7f-4470-838b-e675c4fb5c4a" containerName="mariadb-account-create-update" Mar 12 12:45:33.624518 master-0 
kubenswrapper[13984]: E0312 12:45:33.618620 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f" containerName="mariadb-account-create-update" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618626 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f" containerName="mariadb-account-create-update" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618795 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f87409d-d6ed-4313-8daa-e92892584f21" containerName="dnsmasq-dns" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618809 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a1f634-3f7f-4470-838b-e675c4fb5c4a" containerName="mariadb-account-create-update" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618823 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f" containerName="mariadb-account-create-update" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.618860 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="2635bca8-9408-45c1-b88c-3634684d244a" containerName="mariadb-database-create" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.619456 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hncbq" Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.623983 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-operator-scripts\") pod \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " Mar 12 12:45:33.624518 master-0 kubenswrapper[13984]: I0312 12:45:33.624030 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz69q\" (UniqueName: \"kubernetes.io/projected/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-kube-api-access-dz69q\") pod \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\" (UID: \"f9a1f634-3f7f-4470-838b-e675c4fb5c4a\") " Mar 12 12:45:33.625758 master-0 kubenswrapper[13984]: I0312 12:45:33.624972 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f9a1f634-3f7f-4470-838b-e675c4fb5c4a" (UID: "f9a1f634-3f7f-4470-838b-e675c4fb5c4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:33.653957 master-0 kubenswrapper[13984]: I0312 12:45:33.653730 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-kube-api-access-dz69q" (OuterVolumeSpecName: "kube-api-access-dz69q") pod "f9a1f634-3f7f-4470-838b-e675c4fb5c4a" (UID: "f9a1f634-3f7f-4470-838b-e675c4fb5c4a"). InnerVolumeSpecName "kube-api-access-dz69q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:33.693553 master-0 kubenswrapper[13984]: I0312 12:45:33.692335 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hncbq"] Mar 12 12:45:33.726870 master-0 kubenswrapper[13984]: I0312 12:45:33.726607 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5mt\" (UniqueName: \"kubernetes.io/projected/68a41ec0-ae93-43bf-97a1-acda5b50ee55-kube-api-access-2h5mt\") pod \"glance-db-create-hncbq\" (UID: \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " pod="openstack/glance-db-create-hncbq" Mar 12 12:45:33.726870 master-0 kubenswrapper[13984]: I0312 12:45:33.726746 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a41ec0-ae93-43bf-97a1-acda5b50ee55-operator-scripts\") pod \"glance-db-create-hncbq\" (UID: \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " pod="openstack/glance-db-create-hncbq" Mar 12 12:45:33.726870 master-0 kubenswrapper[13984]: I0312 12:45:33.726865 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:33.726870 master-0 kubenswrapper[13984]: I0312 12:45:33.726878 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz69q\" (UniqueName: \"kubernetes.io/projected/f9a1f634-3f7f-4470-838b-e675c4fb5c4a-kube-api-access-dz69q\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:33.728303 master-0 kubenswrapper[13984]: I0312 12:45:33.728284 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-2374-account-create-update-thf95"] Mar 12 12:45:33.729901 master-0 kubenswrapper[13984]: I0312 12:45:33.729877 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:33.731772 master-0 kubenswrapper[13984]: I0312 12:45:33.731692 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 12 12:45:33.738244 master-0 kubenswrapper[13984]: I0312 12:45:33.738208 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2374-account-create-update-thf95"] Mar 12 12:45:33.758359 master-0 kubenswrapper[13984]: I0312 12:45:33.758311 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7mhht" Mar 12 12:45:33.775501 master-0 kubenswrapper[13984]: I0312 12:45:33.768878 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:33.830711 master-0 kubenswrapper[13984]: I0312 12:45:33.828670 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a41ec0-ae93-43bf-97a1-acda5b50ee55-operator-scripts\") pod \"glance-db-create-hncbq\" (UID: \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " pod="openstack/glance-db-create-hncbq" Mar 12 12:45:33.830711 master-0 kubenswrapper[13984]: I0312 12:45:33.828841 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5mt\" (UniqueName: \"kubernetes.io/projected/68a41ec0-ae93-43bf-97a1-acda5b50ee55-kube-api-access-2h5mt\") pod \"glance-db-create-hncbq\" (UID: \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " pod="openstack/glance-db-create-hncbq" Mar 12 12:45:33.830711 master-0 kubenswrapper[13984]: I0312 12:45:33.830222 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a41ec0-ae93-43bf-97a1-acda5b50ee55-operator-scripts\") pod \"glance-db-create-hncbq\" (UID: 
\"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " pod="openstack/glance-db-create-hncbq" Mar 12 12:45:33.855747 master-0 kubenswrapper[13984]: I0312 12:45:33.855694 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5mt\" (UniqueName: \"kubernetes.io/projected/68a41ec0-ae93-43bf-97a1-acda5b50ee55-kube-api-access-2h5mt\") pod \"glance-db-create-hncbq\" (UID: \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " pod="openstack/glance-db-create-hncbq" Mar 12 12:45:33.929813 master-0 kubenswrapper[13984]: I0312 12:45:33.929759 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-d62b-account-create-update-lr46l" event={"ID":"b6f71885-a3f0-4897-bc6c-bfd657a35108","Type":"ContainerDied","Data":"c07c90c6fefa5f1f13392fa4a0c1b3bb46ed5e61f557b55fa1e1f9a6647c4508"} Mar 12 12:45:33.930029 master-0 kubenswrapper[13984]: I0312 12:45:33.929841 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b6f71885-a3f0-4897-bc6c-bfd657a35108-operator-scripts\") pod \"b6f71885-a3f0-4897-bc6c-bfd657a35108\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " Mar 12 12:45:33.930029 master-0 kubenswrapper[13984]: I0312 12:45:33.929936 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxl28\" (UniqueName: \"kubernetes.io/projected/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-kube-api-access-wxl28\") pod \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " Mar 12 12:45:33.930101 master-0 kubenswrapper[13984]: I0312 12:45:33.930030 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgk9n\" (UniqueName: \"kubernetes.io/projected/b6f71885-a3f0-4897-bc6c-bfd657a35108-kube-api-access-kgk9n\") pod \"b6f71885-a3f0-4897-bc6c-bfd657a35108\" (UID: \"b6f71885-a3f0-4897-bc6c-bfd657a35108\") " Mar 12 12:45:33.930101 master-0 
kubenswrapper[13984]: I0312 12:45:33.930068 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-operator-scripts\") pod \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\" (UID: \"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f\") " Mar 12 12:45:33.930387 master-0 kubenswrapper[13984]: I0312 12:45:33.930360 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4x6r6\" (UniqueName: \"kubernetes.io/projected/b671c7b6-86a7-4134-a072-70b23814f541-kube-api-access-4x6r6\") pod \"glance-2374-account-create-update-thf95\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:33.930433 master-0 kubenswrapper[13984]: I0312 12:45:33.930417 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b671c7b6-86a7-4134-a072-70b23814f541-operator-scripts\") pod \"glance-2374-account-create-update-thf95\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:33.930684 master-0 kubenswrapper[13984]: I0312 12:45:33.930608 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-d62b-account-create-update-lr46l" Mar 12 12:45:33.930984 master-0 kubenswrapper[13984]: I0312 12:45:33.930956 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6f71885-a3f0-4897-bc6c-bfd657a35108-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b6f71885-a3f0-4897-bc6c-bfd657a35108" (UID: "b6f71885-a3f0-4897-bc6c-bfd657a35108"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:33.930984 master-0 kubenswrapper[13984]: I0312 12:45:33.929858 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c07c90c6fefa5f1f13392fa4a0c1b3bb46ed5e61f557b55fa1e1f9a6647c4508" Mar 12 12:45:33.931606 master-0 kubenswrapper[13984]: I0312 12:45:33.931568 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "13933f72-2f21-4c9d-8d80-c5fbd0e9f94f" (UID: "13933f72-2f21-4c9d-8d80-c5fbd0e9f94f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:33.933710 master-0 kubenswrapper[13984]: I0312 12:45:33.933355 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7mhht" event={"ID":"13933f72-2f21-4c9d-8d80-c5fbd0e9f94f","Type":"ContainerDied","Data":"91bdd45d27c8b2283e85f9a6a916287b62ef05e4e8b2710ecd81ba1bdac8cd8b"} Mar 12 12:45:33.933710 master-0 kubenswrapper[13984]: I0312 12:45:33.933388 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91bdd45d27c8b2283e85f9a6a916287b62ef05e4e8b2710ecd81ba1bdac8cd8b" Mar 12 12:45:33.933710 master-0 kubenswrapper[13984]: I0312 12:45:33.933414 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7mhht" Mar 12 12:45:33.935548 master-0 kubenswrapper[13984]: I0312 12:45:33.935517 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-1719-account-create-update-6wbfk" event={"ID":"f9a1f634-3f7f-4470-838b-e675c4fb5c4a","Type":"ContainerDied","Data":"cec4f229c2d0ccdfe8568e9b7f34cacaa7b594259805d8b8ad70907f316afd1c"} Mar 12 12:45:33.935548 master-0 kubenswrapper[13984]: I0312 12:45:33.935544 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cec4f229c2d0ccdfe8568e9b7f34cacaa7b594259805d8b8ad70907f316afd1c" Mar 12 12:45:33.935633 master-0 kubenswrapper[13984]: I0312 12:45:33.935561 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-kube-api-access-wxl28" (OuterVolumeSpecName: "kube-api-access-wxl28") pod "13933f72-2f21-4c9d-8d80-c5fbd0e9f94f" (UID: "13933f72-2f21-4c9d-8d80-c5fbd0e9f94f"). InnerVolumeSpecName "kube-api-access-wxl28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:33.935633 master-0 kubenswrapper[13984]: I0312 12:45:33.935570 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-1719-account-create-update-6wbfk" Mar 12 12:45:33.935688 master-0 kubenswrapper[13984]: I0312 12:45:33.935655 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f71885-a3f0-4897-bc6c-bfd657a35108-kube-api-access-kgk9n" (OuterVolumeSpecName: "kube-api-access-kgk9n") pod "b6f71885-a3f0-4897-bc6c-bfd657a35108" (UID: "b6f71885-a3f0-4897-bc6c-bfd657a35108"). InnerVolumeSpecName "kube-api-access-kgk9n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:33.996522 master-0 kubenswrapper[13984]: I0312 12:45:33.996435 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f87409d-d6ed-4313-8daa-e92892584f21" path="/var/lib/kubelet/pods/7f87409d-d6ed-4313-8daa-e92892584f21/volumes" Mar 12 12:45:34.032718 master-0 kubenswrapper[13984]: I0312 12:45:34.032659 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4x6r6\" (UniqueName: \"kubernetes.io/projected/b671c7b6-86a7-4134-a072-70b23814f541-kube-api-access-4x6r6\") pod \"glance-2374-account-create-update-thf95\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:34.032873 master-0 kubenswrapper[13984]: I0312 12:45:34.032757 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b671c7b6-86a7-4134-a072-70b23814f541-operator-scripts\") pod \"glance-2374-account-create-update-thf95\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:34.032942 master-0 kubenswrapper[13984]: I0312 12:45:34.032909 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgk9n\" (UniqueName: \"kubernetes.io/projected/b6f71885-a3f0-4897-bc6c-bfd657a35108-kube-api-access-kgk9n\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:34.032942 master-0 kubenswrapper[13984]: I0312 12:45:34.032927 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:34.032942 master-0 kubenswrapper[13984]: I0312 12:45:34.032940 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b6f71885-a3f0-4897-bc6c-bfd657a35108-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:34.033069 master-0 kubenswrapper[13984]: I0312 12:45:34.032953 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxl28\" (UniqueName: \"kubernetes.io/projected/13933f72-2f21-4c9d-8d80-c5fbd0e9f94f-kube-api-access-wxl28\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:34.033813 master-0 kubenswrapper[13984]: I0312 12:45:34.033775 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b671c7b6-86a7-4134-a072-70b23814f541-operator-scripts\") pod \"glance-2374-account-create-update-thf95\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:34.051973 master-0 kubenswrapper[13984]: I0312 12:45:34.051936 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hncbq" Mar 12 12:45:34.052319 master-0 kubenswrapper[13984]: I0312 12:45:34.052269 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4x6r6\" (UniqueName: \"kubernetes.io/projected/b671c7b6-86a7-4134-a072-70b23814f541-kube-api-access-4x6r6\") pod \"glance-2374-account-create-update-thf95\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:34.086752 master-0 kubenswrapper[13984]: I0312 12:45:34.086596 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:34.549224 master-0 kubenswrapper[13984]: I0312 12:45:34.547579 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hncbq"] Mar 12 12:45:34.665180 master-0 kubenswrapper[13984]: W0312 12:45:34.665132 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb671c7b6_86a7_4134_a072_70b23814f541.slice/crio-9e9fee75a0ee4fd75d91c25efab5c60863f05972911723895c4bb4915ad7086b WatchSource:0}: Error finding container 9e9fee75a0ee4fd75d91c25efab5c60863f05972911723895c4bb4915ad7086b: Status 404 returned error can't find the container with id 9e9fee75a0ee4fd75d91c25efab5c60863f05972911723895c4bb4915ad7086b Mar 12 12:45:34.678065 master-0 kubenswrapper[13984]: I0312 12:45:34.677996 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-2374-account-create-update-thf95"] Mar 12 12:45:34.950121 master-0 kubenswrapper[13984]: I0312 12:45:34.950063 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hncbq" event={"ID":"68a41ec0-ae93-43bf-97a1-acda5b50ee55","Type":"ContainerStarted","Data":"3b6d6bc91582685abd656d4c376b368c51ad784c504d673282b8f47da5e5f242"} Mar 12 12:45:34.950121 master-0 kubenswrapper[13984]: I0312 12:45:34.950123 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hncbq" event={"ID":"68a41ec0-ae93-43bf-97a1-acda5b50ee55","Type":"ContainerStarted","Data":"8d40aca1aa0706d56befc1173a24d7af869db0af0832355b6cf7738c35473fd8"} Mar 12 12:45:34.962507 master-0 kubenswrapper[13984]: I0312 12:45:34.956807 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2374-account-create-update-thf95" event={"ID":"b671c7b6-86a7-4134-a072-70b23814f541","Type":"ContainerStarted","Data":"8b731f8fc89c92ef27c91b55c8807d923f58069d7d1474b832c70504a169c7ca"} Mar 12 
12:45:34.962507 master-0 kubenswrapper[13984]: I0312 12:45:34.956890 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2374-account-create-update-thf95" event={"ID":"b671c7b6-86a7-4134-a072-70b23814f541","Type":"ContainerStarted","Data":"9e9fee75a0ee4fd75d91c25efab5c60863f05972911723895c4bb4915ad7086b"} Mar 12 12:45:34.983739 master-0 kubenswrapper[13984]: I0312 12:45:34.978396 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-hncbq" podStartSLOduration=1.9783745910000001 podStartE2EDuration="1.978374591s" podCreationTimestamp="2026-03-12 12:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:34.968570561 +0000 UTC m=+1267.166586063" watchObservedRunningTime="2026-03-12 12:45:34.978374591 +0000 UTC m=+1267.176390083" Mar 12 12:45:35.028617 master-0 kubenswrapper[13984]: I0312 12:45:35.022541 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-2374-account-create-update-thf95" podStartSLOduration=2.022522774 podStartE2EDuration="2.022522774s" podCreationTimestamp="2026-03-12 12:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:35.013916703 +0000 UTC m=+1267.211932195" watchObservedRunningTime="2026-03-12 12:45:35.022522774 +0000 UTC m=+1267.220538266" Mar 12 12:45:35.966968 master-0 kubenswrapper[13984]: I0312 12:45:35.966915 13984 generic.go:334] "Generic (PLEG): container finished" podID="b671c7b6-86a7-4134-a072-70b23814f541" containerID="8b731f8fc89c92ef27c91b55c8807d923f58069d7d1474b832c70504a169c7ca" exitCode=0 Mar 12 12:45:35.967503 master-0 kubenswrapper[13984]: I0312 12:45:35.967025 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2374-account-create-update-thf95" 
event={"ID":"b671c7b6-86a7-4134-a072-70b23814f541","Type":"ContainerDied","Data":"8b731f8fc89c92ef27c91b55c8807d923f58069d7d1474b832c70504a169c7ca"} Mar 12 12:45:35.969234 master-0 kubenswrapper[13984]: I0312 12:45:35.969200 13984 generic.go:334] "Generic (PLEG): container finished" podID="68a41ec0-ae93-43bf-97a1-acda5b50ee55" containerID="3b6d6bc91582685abd656d4c376b368c51ad784c504d673282b8f47da5e5f242" exitCode=0 Mar 12 12:45:35.969333 master-0 kubenswrapper[13984]: I0312 12:45:35.969266 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hncbq" event={"ID":"68a41ec0-ae93-43bf-97a1-acda5b50ee55","Type":"ContainerDied","Data":"3b6d6bc91582685abd656d4c376b368c51ad784c504d673282b8f47da5e5f242"} Mar 12 12:45:36.102054 master-0 kubenswrapper[13984]: I0312 12:45:36.101966 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-wgcnp"] Mar 12 12:45:36.112231 master-0 kubenswrapper[13984]: I0312 12:45:36.112155 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-wgcnp"] Mar 12 12:45:36.191710 master-0 kubenswrapper[13984]: I0312 12:45:36.191641 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-vcdjd"] Mar 12 12:45:36.192157 master-0 kubenswrapper[13984]: E0312 12:45:36.192125 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f71885-a3f0-4897-bc6c-bfd657a35108" containerName="mariadb-account-create-update" Mar 12 12:45:36.192157 master-0 kubenswrapper[13984]: I0312 12:45:36.192153 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f71885-a3f0-4897-bc6c-bfd657a35108" containerName="mariadb-account-create-update" Mar 12 12:45:36.192262 master-0 kubenswrapper[13984]: E0312 12:45:36.192195 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13933f72-2f21-4c9d-8d80-c5fbd0e9f94f" containerName="mariadb-database-create" Mar 12 12:45:36.192262 master-0 
kubenswrapper[13984]: I0312 12:45:36.192205 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="13933f72-2f21-4c9d-8d80-c5fbd0e9f94f" containerName="mariadb-database-create" Mar 12 12:45:36.192506 master-0 kubenswrapper[13984]: I0312 12:45:36.192466 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f71885-a3f0-4897-bc6c-bfd657a35108" containerName="mariadb-account-create-update" Mar 12 12:45:36.192591 master-0 kubenswrapper[13984]: I0312 12:45:36.192527 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="13933f72-2f21-4c9d-8d80-c5fbd0e9f94f" containerName="mariadb-database-create" Mar 12 12:45:36.193546 master-0 kubenswrapper[13984]: I0312 12:45:36.193472 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.197553 master-0 kubenswrapper[13984]: I0312 12:45:36.197417 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 12 12:45:36.202011 master-0 kubenswrapper[13984]: I0312 12:45:36.201970 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vcdjd"] Mar 12 12:45:36.216721 master-0 kubenswrapper[13984]: I0312 12:45:36.216634 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkpkm\" (UniqueName: \"kubernetes.io/projected/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-kube-api-access-vkpkm\") pod \"root-account-create-update-vcdjd\" (UID: \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") " pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.216721 master-0 kubenswrapper[13984]: I0312 12:45:36.216724 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-operator-scripts\") pod \"root-account-create-update-vcdjd\" (UID: 
\"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") " pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.318748 master-0 kubenswrapper[13984]: I0312 12:45:36.318314 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkpkm\" (UniqueName: \"kubernetes.io/projected/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-kube-api-access-vkpkm\") pod \"root-account-create-update-vcdjd\" (UID: \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") " pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.318748 master-0 kubenswrapper[13984]: I0312 12:45:36.318497 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-operator-scripts\") pod \"root-account-create-update-vcdjd\" (UID: \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") " pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.319537 master-0 kubenswrapper[13984]: I0312 12:45:36.319440 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-operator-scripts\") pod \"root-account-create-update-vcdjd\" (UID: \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") " pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.335007 master-0 kubenswrapper[13984]: I0312 12:45:36.334706 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkpkm\" (UniqueName: \"kubernetes.io/projected/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-kube-api-access-vkpkm\") pod \"root-account-create-update-vcdjd\" (UID: \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") " pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.518701 master-0 kubenswrapper[13984]: I0312 12:45:36.518586 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-vcdjd" Mar 12 12:45:36.986184 master-0 kubenswrapper[13984]: I0312 12:45:36.986099 13984 generic.go:334] "Generic (PLEG): container finished" podID="5767f9cc-96f2-4309-a6da-e89247924459" containerID="b9354b78aa5b677786c949a06dfde921f1036d738cc9b853302ab77cc909dde4" exitCode=0 Mar 12 12:45:36.986938 master-0 kubenswrapper[13984]: I0312 12:45:36.986184 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2mjxx" event={"ID":"5767f9cc-96f2-4309-a6da-e89247924459","Type":"ContainerDied","Data":"b9354b78aa5b677786c949a06dfde921f1036d738cc9b853302ab77cc909dde4"} Mar 12 12:45:37.069572 master-0 kubenswrapper[13984]: I0312 12:45:37.069509 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-vcdjd"] Mar 12 12:45:37.652340 master-0 kubenswrapper[13984]: I0312 12:45:37.652262 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:37.665726 master-0 kubenswrapper[13984]: I0312 12:45:37.665664 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hncbq" Mar 12 12:45:37.765491 master-0 kubenswrapper[13984]: I0312 12:45:37.765432 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a41ec0-ae93-43bf-97a1-acda5b50ee55-operator-scripts\") pod \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\" (UID: \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " Mar 12 12:45:37.765997 master-0 kubenswrapper[13984]: I0312 12:45:37.765537 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h5mt\" (UniqueName: \"kubernetes.io/projected/68a41ec0-ae93-43bf-97a1-acda5b50ee55-kube-api-access-2h5mt\") pod \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\" (UID: \"68a41ec0-ae93-43bf-97a1-acda5b50ee55\") " Mar 12 12:45:37.765997 master-0 kubenswrapper[13984]: I0312 12:45:37.765668 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4x6r6\" (UniqueName: \"kubernetes.io/projected/b671c7b6-86a7-4134-a072-70b23814f541-kube-api-access-4x6r6\") pod \"b671c7b6-86a7-4134-a072-70b23814f541\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " Mar 12 12:45:37.765997 master-0 kubenswrapper[13984]: I0312 12:45:37.765716 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b671c7b6-86a7-4134-a072-70b23814f541-operator-scripts\") pod \"b671c7b6-86a7-4134-a072-70b23814f541\" (UID: \"b671c7b6-86a7-4134-a072-70b23814f541\") " Mar 12 12:45:37.766131 master-0 kubenswrapper[13984]: I0312 12:45:37.766047 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/68a41ec0-ae93-43bf-97a1-acda5b50ee55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "68a41ec0-ae93-43bf-97a1-acda5b50ee55" (UID: "68a41ec0-ae93-43bf-97a1-acda5b50ee55"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:37.766235 master-0 kubenswrapper[13984]: I0312 12:45:37.766201 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/68a41ec0-ae93-43bf-97a1-acda5b50ee55-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:37.766460 master-0 kubenswrapper[13984]: I0312 12:45:37.766427 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b671c7b6-86a7-4134-a072-70b23814f541-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b671c7b6-86a7-4134-a072-70b23814f541" (UID: "b671c7b6-86a7-4134-a072-70b23814f541"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:45:37.769519 master-0 kubenswrapper[13984]: I0312 12:45:37.769459 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b671c7b6-86a7-4134-a072-70b23814f541-kube-api-access-4x6r6" (OuterVolumeSpecName: "kube-api-access-4x6r6") pod "b671c7b6-86a7-4134-a072-70b23814f541" (UID: "b671c7b6-86a7-4134-a072-70b23814f541"). InnerVolumeSpecName "kube-api-access-4x6r6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:37.769769 master-0 kubenswrapper[13984]: I0312 12:45:37.769721 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a41ec0-ae93-43bf-97a1-acda5b50ee55-kube-api-access-2h5mt" (OuterVolumeSpecName: "kube-api-access-2h5mt") pod "68a41ec0-ae93-43bf-97a1-acda5b50ee55" (UID: "68a41ec0-ae93-43bf-97a1-acda5b50ee55"). InnerVolumeSpecName "kube-api-access-2h5mt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:45:37.868106 master-0 kubenswrapper[13984]: I0312 12:45:37.867774 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4x6r6\" (UniqueName: \"kubernetes.io/projected/b671c7b6-86a7-4134-a072-70b23814f541-kube-api-access-4x6r6\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:37.868106 master-0 kubenswrapper[13984]: I0312 12:45:37.867817 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b671c7b6-86a7-4134-a072-70b23814f541-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:37.868106 master-0 kubenswrapper[13984]: I0312 12:45:37.867827 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h5mt\" (UniqueName: \"kubernetes.io/projected/68a41ec0-ae93-43bf-97a1-acda5b50ee55-kube-api-access-2h5mt\") on node \"master-0\" DevicePath \"\"" Mar 12 12:45:37.999163 master-0 kubenswrapper[13984]: I0312 12:45:37.998187 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f" path="/var/lib/kubelet/pods/a1441db1-ea2c-4e0b-bf86-9ec2b1c4df8f/volumes" Mar 12 12:45:38.013044 master-0 kubenswrapper[13984]: I0312 12:45:38.008988 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-2374-account-create-update-thf95" Mar 12 12:45:38.013044 master-0 kubenswrapper[13984]: I0312 12:45:38.009138 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-2374-account-create-update-thf95" event={"ID":"b671c7b6-86a7-4134-a072-70b23814f541","Type":"ContainerDied","Data":"9e9fee75a0ee4fd75d91c25efab5c60863f05972911723895c4bb4915ad7086b"} Mar 12 12:45:38.013044 master-0 kubenswrapper[13984]: I0312 12:45:38.009310 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e9fee75a0ee4fd75d91c25efab5c60863f05972911723895c4bb4915ad7086b" Mar 12 12:45:38.016709 master-0 kubenswrapper[13984]: I0312 12:45:38.014394 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hncbq" Mar 12 12:45:38.016709 master-0 kubenswrapper[13984]: I0312 12:45:38.014559 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hncbq" event={"ID":"68a41ec0-ae93-43bf-97a1-acda5b50ee55","Type":"ContainerDied","Data":"8d40aca1aa0706d56befc1173a24d7af869db0af0832355b6cf7738c35473fd8"} Mar 12 12:45:38.016709 master-0 kubenswrapper[13984]: I0312 12:45:38.014654 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d40aca1aa0706d56befc1173a24d7af869db0af0832355b6cf7738c35473fd8" Mar 12 12:45:38.016982 master-0 kubenswrapper[13984]: I0312 12:45:38.016817 13984 generic.go:334] "Generic (PLEG): container finished" podID="8fc04a62-dfbf-401b-88ba-42dcf4acfaac" containerID="a08a73fd0b5adf5d7fcfdc9f5c85ee2ac38ea08cabf986b293f1d5d347547be8" exitCode=0 Mar 12 12:45:38.017143 master-0 kubenswrapper[13984]: I0312 12:45:38.017075 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vcdjd" event={"ID":"8fc04a62-dfbf-401b-88ba-42dcf4acfaac","Type":"ContainerDied","Data":"a08a73fd0b5adf5d7fcfdc9f5c85ee2ac38ea08cabf986b293f1d5d347547be8"} 
Mar 12 12:45:38.017143 master-0 kubenswrapper[13984]: I0312 12:45:38.017126 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vcdjd" event={"ID":"8fc04a62-dfbf-401b-88ba-42dcf4acfaac","Type":"ContainerStarted","Data":"0118051cb97ce8103037e742ca0b43f7a290573ece27483fb5135d9088778f09"}
Mar 12 12:45:38.498582 master-0 kubenswrapper[13984]: I0312 12:45:38.497903 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:38.597840 master-0 kubenswrapper[13984]: I0312 12:45:38.597772 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-dispersionconf\") pod \"5767f9cc-96f2-4309-a6da-e89247924459\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
Mar 12 12:45:38.598078 master-0 kubenswrapper[13984]: I0312 12:45:38.597981 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5767f9cc-96f2-4309-a6da-e89247924459-etc-swift\") pod \"5767f9cc-96f2-4309-a6da-e89247924459\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
Mar 12 12:45:38.598078 master-0 kubenswrapper[13984]: I0312 12:45:38.598033 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9k8q\" (UniqueName: \"kubernetes.io/projected/5767f9cc-96f2-4309-a6da-e89247924459-kube-api-access-d9k8q\") pod \"5767f9cc-96f2-4309-a6da-e89247924459\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
Mar 12 12:45:38.598179 master-0 kubenswrapper[13984]: I0312 12:45:38.598094 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-swiftconf\") pod \"5767f9cc-96f2-4309-a6da-e89247924459\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
Mar 12 12:45:38.598179 master-0 kubenswrapper[13984]: I0312 12:45:38.598131 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-combined-ca-bundle\") pod \"5767f9cc-96f2-4309-a6da-e89247924459\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
Mar 12 12:45:38.598266 master-0 kubenswrapper[13984]: I0312 12:45:38.598241 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-scripts\") pod \"5767f9cc-96f2-4309-a6da-e89247924459\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
Mar 12 12:45:38.598327 master-0 kubenswrapper[13984]: I0312 12:45:38.598310 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-ring-data-devices\") pod \"5767f9cc-96f2-4309-a6da-e89247924459\" (UID: \"5767f9cc-96f2-4309-a6da-e89247924459\") "
Mar 12 12:45:38.599576 master-0 kubenswrapper[13984]: I0312 12:45:38.599288 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5767f9cc-96f2-4309-a6da-e89247924459" (UID: "5767f9cc-96f2-4309-a6da-e89247924459"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:38.599576 master-0 kubenswrapper[13984]: I0312 12:45:38.599543 13984 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-ring-data-devices\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:38.600711 master-0 kubenswrapper[13984]: I0312 12:45:38.600656 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5767f9cc-96f2-4309-a6da-e89247924459-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5767f9cc-96f2-4309-a6da-e89247924459" (UID: "5767f9cc-96f2-4309-a6da-e89247924459"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:45:38.602359 master-0 kubenswrapper[13984]: I0312 12:45:38.602300 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5767f9cc-96f2-4309-a6da-e89247924459-kube-api-access-d9k8q" (OuterVolumeSpecName: "kube-api-access-d9k8q") pod "5767f9cc-96f2-4309-a6da-e89247924459" (UID: "5767f9cc-96f2-4309-a6da-e89247924459"). InnerVolumeSpecName "kube-api-access-d9k8q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:45:38.603857 master-0 kubenswrapper[13984]: I0312 12:45:38.603790 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5767f9cc-96f2-4309-a6da-e89247924459" (UID: "5767f9cc-96f2-4309-a6da-e89247924459"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:45:38.619497 master-0 kubenswrapper[13984]: I0312 12:45:38.619418 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-scripts" (OuterVolumeSpecName: "scripts") pod "5767f9cc-96f2-4309-a6da-e89247924459" (UID: "5767f9cc-96f2-4309-a6da-e89247924459"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:38.621443 master-0 kubenswrapper[13984]: I0312 12:45:38.621403 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5767f9cc-96f2-4309-a6da-e89247924459" (UID: "5767f9cc-96f2-4309-a6da-e89247924459"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:45:38.621561 master-0 kubenswrapper[13984]: I0312 12:45:38.621436 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5767f9cc-96f2-4309-a6da-e89247924459" (UID: "5767f9cc-96f2-4309-a6da-e89247924459"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:45:38.701560 master-0 kubenswrapper[13984]: I0312 12:45:38.701492 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:38.702025 master-0 kubenswrapper[13984]: I0312 12:45:38.701979 13984 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-dispersionconf\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:38.702154 master-0 kubenswrapper[13984]: I0312 12:45:38.702124 13984 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5767f9cc-96f2-4309-a6da-e89247924459-etc-swift\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:38.702202 master-0 kubenswrapper[13984]: I0312 12:45:38.702155 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9k8q\" (UniqueName: \"kubernetes.io/projected/5767f9cc-96f2-4309-a6da-e89247924459-kube-api-access-d9k8q\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:38.702202 master-0 kubenswrapper[13984]: I0312 12:45:38.702168 13984 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-swiftconf\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:38.702202 master-0 kubenswrapper[13984]: I0312 12:45:38.702180 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5767f9cc-96f2-4309-a6da-e89247924459-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:38.702202 master-0 kubenswrapper[13984]: I0312 12:45:38.702191 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5767f9cc-96f2-4309-a6da-e89247924459-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:38.706105 master-0 kubenswrapper[13984]: I0312 12:45:38.706059 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/92e2764a-7fca-4b4a-ae89-131f181cdeb9-etc-swift\") pod \"swift-storage-0\" (UID: \"92e2764a-7fca-4b4a-ae89-131f181cdeb9\") " pod="openstack/swift-storage-0"
Mar 12 12:45:39.002661 master-0 kubenswrapper[13984]: I0312 12:45:39.002424 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 12 12:45:39.027120 master-0 kubenswrapper[13984]: I0312 12:45:39.027052 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-2mjxx" event={"ID":"5767f9cc-96f2-4309-a6da-e89247924459","Type":"ContainerDied","Data":"26c3fb47654e05adf7ddf25bc12489f3503b85020234f85cda8f5be4961dd9c4"}
Mar 12 12:45:39.027240 master-0 kubenswrapper[13984]: I0312 12:45:39.027130 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26c3fb47654e05adf7ddf25bc12489f3503b85020234f85cda8f5be4961dd9c4"
Mar 12 12:45:39.027240 master-0 kubenswrapper[13984]: I0312 12:45:39.027068 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-2mjxx"
Mar 12 12:45:39.602348 master-0 kubenswrapper[13984]: I0312 12:45:39.602273 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vcdjd"
Mar 12 12:45:39.689807 master-0 kubenswrapper[13984]: W0312 12:45:39.689748 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92e2764a_7fca_4b4a_ae89_131f181cdeb9.slice/crio-90621b13acaf38a192157134580694b67d558f73872da532e4d8735dd0264d52 WatchSource:0}: Error finding container 90621b13acaf38a192157134580694b67d558f73872da532e4d8735dd0264d52: Status 404 returned error can't find the container with id 90621b13acaf38a192157134580694b67d558f73872da532e4d8735dd0264d52
Mar 12 12:45:39.691408 master-0 kubenswrapper[13984]: I0312 12:45:39.691342 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 12 12:45:39.737688 master-0 kubenswrapper[13984]: I0312 12:45:39.737630 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkpkm\" (UniqueName: \"kubernetes.io/projected/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-kube-api-access-vkpkm\") pod \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\" (UID: \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") "
Mar 12 12:45:39.738362 master-0 kubenswrapper[13984]: I0312 12:45:39.738333 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-operator-scripts\") pod \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\" (UID: \"8fc04a62-dfbf-401b-88ba-42dcf4acfaac\") "
Mar 12 12:45:39.738823 master-0 kubenswrapper[13984]: I0312 12:45:39.738781 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fc04a62-dfbf-401b-88ba-42dcf4acfaac" (UID: "8fc04a62-dfbf-401b-88ba-42dcf4acfaac"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:45:39.743506 master-0 kubenswrapper[13984]: I0312 12:45:39.740000 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:39.743506 master-0 kubenswrapper[13984]: I0312 12:45:39.741741 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-kube-api-access-vkpkm" (OuterVolumeSpecName: "kube-api-access-vkpkm") pod "8fc04a62-dfbf-401b-88ba-42dcf4acfaac" (UID: "8fc04a62-dfbf-401b-88ba-42dcf4acfaac"). InnerVolumeSpecName "kube-api-access-vkpkm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:45:39.842508 master-0 kubenswrapper[13984]: I0312 12:45:39.842448 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkpkm\" (UniqueName: \"kubernetes.io/projected/8fc04a62-dfbf-401b-88ba-42dcf4acfaac-kube-api-access-vkpkm\") on node \"master-0\" DevicePath \"\""
Mar 12 12:45:40.039270 master-0 kubenswrapper[13984]: I0312 12:45:40.039210 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"90621b13acaf38a192157134580694b67d558f73872da532e4d8735dd0264d52"}
Mar 12 12:45:40.041584 master-0 kubenswrapper[13984]: I0312 12:45:40.041521 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-vcdjd" event={"ID":"8fc04a62-dfbf-401b-88ba-42dcf4acfaac","Type":"ContainerDied","Data":"0118051cb97ce8103037e742ca0b43f7a290573ece27483fb5135d9088778f09"}
Mar 12 12:45:40.041682 master-0 kubenswrapper[13984]: I0312 12:45:40.041595 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0118051cb97ce8103037e742ca0b43f7a290573ece27483fb5135d9088778f09"
Mar 12 12:45:40.042629 master-0 kubenswrapper[13984]: I0312 12:45:40.042589 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-vcdjd"
Mar 12 12:45:41.056764 master-0 kubenswrapper[13984]: I0312 12:45:41.054121 13984 generic.go:334] "Generic (PLEG): container finished" podID="ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83" containerID="b54f88294904df46f0848fd01fac6018f1cdf4c4066102661c571b0f08ff4a6c" exitCode=0
Mar 12 12:45:41.056764 master-0 kubenswrapper[13984]: I0312 12:45:41.054189 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83","Type":"ContainerDied","Data":"b54f88294904df46f0848fd01fac6018f1cdf4c4066102661c571b0f08ff4a6c"}
Mar 12 12:45:41.061511 master-0 kubenswrapper[13984]: I0312 12:45:41.058079 13984 generic.go:334] "Generic (PLEG): container finished" podID="f328e3e3-f9b4-4a88-9883-694d89c182f7" containerID="733e368e6e05be7e45d1f385392d95b4c6f01e221666f2ea77cca3a0fafa2157" exitCode=0
Mar 12 12:45:41.061511 master-0 kubenswrapper[13984]: I0312 12:45:41.058141 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f328e3e3-f9b4-4a88-9883-694d89c182f7","Type":"ContainerDied","Data":"733e368e6e05be7e45d1f385392d95b4c6f01e221666f2ea77cca3a0fafa2157"}
Mar 12 12:45:41.467797 master-0 kubenswrapper[13984]: I0312 12:45:41.467761 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 12 12:45:42.082363 master-0 kubenswrapper[13984]: I0312 12:45:42.082306 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"f328e3e3-f9b4-4a88-9883-694d89c182f7","Type":"ContainerStarted","Data":"2aaa5fe2c14968a48902c0f98df03dc322121d1725888041da54256d0491e817"}
Mar 12 12:45:42.083115 master-0 kubenswrapper[13984]: I0312 12:45:42.082593 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 12 12:45:42.085734 master-0 kubenswrapper[13984]: I0312 12:45:42.085455 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"9d898a8819e1171e6a164b4d8c1c903af76645a1e6bfb37be90d923dff79e82f"}
Mar 12 12:45:42.085734 master-0 kubenswrapper[13984]: I0312 12:45:42.085495 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"47a3fdb930583db794226fc039292b05ca097ab01e6b57abaf013df8923796f6"}
Mar 12 12:45:42.085734 master-0 kubenswrapper[13984]: I0312 12:45:42.085505 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"07e5eec2bb5d05cb731ce7bd52d333bad4167c13bf17135a300e34ea603d7b72"}
Mar 12 12:45:42.085734 master-0 kubenswrapper[13984]: I0312 12:45:42.085514 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"879d6e90118a16f90a2daaf41ca8f37f18aaf0be60d54c62e92e2f49bb596847"}
Mar 12 12:45:42.087645 master-0 kubenswrapper[13984]: I0312 12:45:42.087604 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83","Type":"ContainerStarted","Data":"1202350d6e01a07bc5da58da81fedde0b0c83ec24a2fa2cf0aff243968ef3292"}
Mar 12 12:45:42.088026 master-0 kubenswrapper[13984]: I0312 12:45:42.087960 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:45:42.114755 master-0 kubenswrapper[13984]: I0312 12:45:42.114669 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.271010128 podStartE2EDuration="58.114638275s" podCreationTimestamp="2026-03-12 12:44:44 +0000 UTC" firstStartedPulling="2026-03-12 12:45:02.382146106 +0000 UTC m=+1234.580161598" lastFinishedPulling="2026-03-12 12:45:07.225774253 +0000 UTC m=+1239.423789745" observedRunningTime="2026-03-12 12:45:42.105787838 +0000 UTC m=+1274.303803340" watchObservedRunningTime="2026-03-12 12:45:42.114638275 +0000 UTC m=+1274.312653767"
Mar 12 12:45:42.141673 master-0 kubenswrapper[13984]: I0312 12:45:42.140751 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.542828475 podStartE2EDuration="58.140730566s" podCreationTimestamp="2026-03-12 12:44:44 +0000 UTC" firstStartedPulling="2026-03-12 12:45:01.663681735 +0000 UTC m=+1233.861697227" lastFinishedPulling="2026-03-12 12:45:07.261583826 +0000 UTC m=+1239.459599318" observedRunningTime="2026-03-12 12:45:42.140120952 +0000 UTC m=+1274.338136444" watchObservedRunningTime="2026-03-12 12:45:42.140730566 +0000 UTC m=+1274.338746068"
Mar 12 12:45:43.895973 master-0 kubenswrapper[13984]: I0312 12:45:43.895856 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-zt2nr"]
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: E0312 12:45:43.896392 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b671c7b6-86a7-4134-a072-70b23814f541" containerName="mariadb-account-create-update"
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: I0312 12:45:43.896410 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b671c7b6-86a7-4134-a072-70b23814f541" containerName="mariadb-account-create-update"
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: E0312 12:45:43.896439 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a41ec0-ae93-43bf-97a1-acda5b50ee55" containerName="mariadb-database-create"
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: I0312 12:45:43.896449 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a41ec0-ae93-43bf-97a1-acda5b50ee55" containerName="mariadb-database-create"
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: E0312 12:45:43.896494 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5767f9cc-96f2-4309-a6da-e89247924459" containerName="swift-ring-rebalance"
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: I0312 12:45:43.896504 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="5767f9cc-96f2-4309-a6da-e89247924459" containerName="swift-ring-rebalance"
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: E0312 12:45:43.896531 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fc04a62-dfbf-401b-88ba-42dcf4acfaac" containerName="mariadb-account-create-update"
Mar 12 12:45:43.896563 master-0 kubenswrapper[13984]: I0312 12:45:43.896539 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fc04a62-dfbf-401b-88ba-42dcf4acfaac" containerName="mariadb-account-create-update"
Mar 12 12:45:43.896839 master-0 kubenswrapper[13984]: I0312 12:45:43.896808 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fc04a62-dfbf-401b-88ba-42dcf4acfaac" containerName="mariadb-account-create-update"
Mar 12 12:45:43.896877 master-0 kubenswrapper[13984]: I0312 12:45:43.896847 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a41ec0-ae93-43bf-97a1-acda5b50ee55" containerName="mariadb-database-create"
Mar 12 12:45:43.896877 master-0 kubenswrapper[13984]: I0312 12:45:43.896861 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="5767f9cc-96f2-4309-a6da-e89247924459" containerName="swift-ring-rebalance"
Mar 12 12:45:43.896948 master-0 kubenswrapper[13984]: I0312 12:45:43.896879 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="b671c7b6-86a7-4134-a072-70b23814f541" containerName="mariadb-account-create-update"
Mar 12 12:45:43.897735 master-0 kubenswrapper[13984]: I0312 12:45:43.897710 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:43.904352 master-0 kubenswrapper[13984]: I0312 12:45:43.904310 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-f98a5-config-data"
Mar 12 12:45:43.905972 master-0 kubenswrapper[13984]: I0312 12:45:43.905928 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zt2nr"]
Mar 12 12:45:43.945014 master-0 kubenswrapper[13984]: I0312 12:45:43.944951 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-db-sync-config-data\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:43.945144 master-0 kubenswrapper[13984]: I0312 12:45:43.945054 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-config-data\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:43.945144 master-0 kubenswrapper[13984]: I0312 12:45:43.945091 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-combined-ca-bundle\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:43.945144 master-0 kubenswrapper[13984]: I0312 12:45:43.945114 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhgc6\" (UniqueName: \"kubernetes.io/projected/f56d283a-8eb6-4927-bf3c-3145cc96ee28-kube-api-access-fhgc6\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.048541 master-0 kubenswrapper[13984]: I0312 12:45:44.048469 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-db-sync-config-data\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.048828 master-0 kubenswrapper[13984]: I0312 12:45:44.048623 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-config-data\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.048828 master-0 kubenswrapper[13984]: I0312 12:45:44.048659 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-combined-ca-bundle\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.048828 master-0 kubenswrapper[13984]: I0312 12:45:44.048683 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhgc6\" (UniqueName: \"kubernetes.io/projected/f56d283a-8eb6-4927-bf3c-3145cc96ee28-kube-api-access-fhgc6\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.054169 master-0 kubenswrapper[13984]: I0312 12:45:44.054094 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-combined-ca-bundle\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.055035 master-0 kubenswrapper[13984]: I0312 12:45:44.054952 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-db-sync-config-data\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.057051 master-0 kubenswrapper[13984]: I0312 12:45:44.056952 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-config-data\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.078283 master-0 kubenswrapper[13984]: I0312 12:45:44.075911 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhgc6\" (UniqueName: \"kubernetes.io/projected/f56d283a-8eb6-4927-bf3c-3145cc96ee28-kube-api-access-fhgc6\") pod \"glance-db-sync-zt2nr\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.113082 master-0 kubenswrapper[13984]: I0312 12:45:44.113029 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"3855ed9c48227b4cc82cb0c95b59d8157a225ef7632c1e2fe27f8d1564b8ee7c"}
Mar 12 12:45:44.113381 master-0 kubenswrapper[13984]: I0312 12:45:44.113362 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"311a47902fa711903a2aa772db49c6e43a58c69ecb1728ec77a798faadf0781b"}
Mar 12 12:45:44.113520 master-0 kubenswrapper[13984]: I0312 12:45:44.113474 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"5a883acc6d381f7d8b208b1ce63eea671b627ffd5fe0edbe5aef37639633f7fe"}
Mar 12 12:45:44.113633 master-0 kubenswrapper[13984]: I0312 12:45:44.113616 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"e4fc3df936d1a6d48ad8d54959975d9da40a10bab829fb957ca4be3e2b504b4d"}
Mar 12 12:45:44.239734 master-0 kubenswrapper[13984]: I0312 12:45:44.239663 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zt2nr"
Mar 12 12:45:44.822301 master-0 kubenswrapper[13984]: I0312 12:45:44.822253 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-zt2nr"]
Mar 12 12:45:45.126373 master-0 kubenswrapper[13984]: I0312 12:45:45.125969 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zt2nr" event={"ID":"f56d283a-8eb6-4927-bf3c-3145cc96ee28","Type":"ContainerStarted","Data":"b2164e539568c2b4717d8a389f4e8294960f0def7920d6ddf5372d1bd95ae683"}
Mar 12 12:45:45.133918 master-0 kubenswrapper[13984]: I0312 12:45:45.133850 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"05ee006d1f6bb7d775416a469ecb5dd23aa323c019423223678d9fd0a316143d"}
Mar 12 12:45:45.198449 master-0 kubenswrapper[13984]: I0312 12:45:45.198141 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7dvg9" podUID="f5cd50ae-2194-4717-96f0-47b3d353c8b1" containerName="ovn-controller" probeResult="failure" output=<
Mar 12 12:45:45.198449 master-0 kubenswrapper[13984]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Mar 12 12:45:45.198449 master-0 kubenswrapper[13984]: >
Mar 12 12:45:46.148055 master-0 kubenswrapper[13984]: I0312 12:45:46.147726 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"c84eed9b134cfcc7f8295576bd46e9b8125e668af8c9aae51caf3c18bf4b3496"}
Mar 12 12:45:46.148055 master-0 kubenswrapper[13984]: I0312 12:45:46.147778 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"c49251b67d430e11c9262103fad8cc6414179988a2cf83b2daf52aae46f9c47b"}
Mar 12 12:45:46.148055 master-0 kubenswrapper[13984]: I0312 12:45:46.147789 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"45d747c7bb59e2df29a0eee5939bc914dee7b39f3309e29fe4e49f57be7f2ebf"}
Mar 12 12:45:46.148055 master-0 kubenswrapper[13984]: I0312 12:45:46.147797 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"47403f129c8a047ea6299b3bbd4bdf284cc937fe7b16a3cdcf4a67e901f840ec"}
Mar 12 12:45:46.148055 master-0 kubenswrapper[13984]: I0312 12:45:46.147810 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"d4db78ed45eb5efe0a5253d31c09cd15f95300e60aced8bdd407ee10d88aea45"}
Mar 12 12:45:46.148055 master-0 kubenswrapper[13984]: I0312 12:45:46.147818 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"92e2764a-7fca-4b4a-ae89-131f181cdeb9","Type":"ContainerStarted","Data":"d9c08e7c90c8ac90b12936e3b4b4300d30b778a7c8479b88c80c9e54f027303e"}
Mar 12 12:45:46.202346 master-0 kubenswrapper[13984]: I0312 12:45:46.202262 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=21.07166009 podStartE2EDuration="26.202240253s" podCreationTimestamp="2026-03-12 12:45:20 +0000 UTC" firstStartedPulling="2026-03-12 12:45:39.691468841 +0000 UTC m=+1271.889484333" lastFinishedPulling="2026-03-12 12:45:44.822049004 +0000 UTC m=+1277.020064496" observedRunningTime="2026-03-12 12:45:46.182511941 +0000 UTC m=+1278.380527433" watchObservedRunningTime="2026-03-12 12:45:46.202240253 +0000 UTC m=+1278.400255745"
Mar 12 12:45:46.516492 master-0 kubenswrapper[13984]: I0312 12:45:46.514386 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cdcd69d47-blzgv"]
Mar 12 12:45:46.528575 master-0 kubenswrapper[13984]: I0312 12:45:46.525303 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.528575 master-0 kubenswrapper[13984]: I0312 12:45:46.527561 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 12 12:45:46.561915 master-0 kubenswrapper[13984]: I0312 12:45:46.551817 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cdcd69d47-blzgv"]
Mar 12 12:45:46.713166 master-0 kubenswrapper[13984]: I0312 12:45:46.713093 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-sb\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.713338 master-0 kubenswrapper[13984]: I0312 12:45:46.713248 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-nb\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.713338 master-0 kubenswrapper[13984]: I0312 12:45:46.713312 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-swift-storage-0\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.713413 master-0 kubenswrapper[13984]: I0312 12:45:46.713338 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc7x7\" (UniqueName: \"kubernetes.io/projected/e90795eb-e2e0-44f9-8e28-cece07a4230e-kube-api-access-rc7x7\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.713413 master-0 kubenswrapper[13984]: I0312 12:45:46.713399 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-config\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.713492 master-0 kubenswrapper[13984]: I0312 12:45:46.713459 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-svc\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.815274 master-0 kubenswrapper[13984]: I0312 12:45:46.815153 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-nb\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.815274 master-0 kubenswrapper[13984]: I0312 12:45:46.815229 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-swift-storage-0\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.815274 master-0 kubenswrapper[13984]: I0312 12:45:46.815253 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc7x7\" (UniqueName: \"kubernetes.io/projected/e90795eb-e2e0-44f9-8e28-cece07a4230e-kube-api-access-rc7x7\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.815582 master-0 kubenswrapper[13984]: I0312 12:45:46.815309 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-config\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.815582 master-0 kubenswrapper[13984]: I0312 12:45:46.815355 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-svc\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.815582 master-0 kubenswrapper[13984]: I0312 12:45:46.815380 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-sb\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.816276 master-0 kubenswrapper[13984]: I0312 12:45:46.816241 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-sb\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:46.816399 master-0 kubenswrapper[13984]: I0312 12:45:46.816351 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"config\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-config\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:45:46.816528 master-0 kubenswrapper[13984]: I0312 12:45:46.816469 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-nb\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:45:46.816605 master-0 kubenswrapper[13984]: I0312 12:45:46.816499 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-swift-storage-0\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:45:46.816798 master-0 kubenswrapper[13984]: I0312 12:45:46.816730 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-svc\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:45:46.830450 master-0 kubenswrapper[13984]: I0312 12:45:46.830420 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc7x7\" (UniqueName: \"kubernetes.io/projected/e90795eb-e2e0-44f9-8e28-cece07a4230e-kube-api-access-rc7x7\") pod \"dnsmasq-dns-7cdcd69d47-blzgv\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:45:46.878887 master-0 kubenswrapper[13984]: I0312 12:45:46.878820 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:45:47.338926 master-0 kubenswrapper[13984]: I0312 12:45:47.338880 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cdcd69d47-blzgv"] Mar 12 12:45:48.207369 master-0 kubenswrapper[13984]: I0312 12:45:48.207327 13984 generic.go:334] "Generic (PLEG): container finished" podID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerID="f7e33c7c00afd4252c842a3cf5b45d23b8311281def545680df537f481cbaea6" exitCode=0 Mar 12 12:45:48.207623 master-0 kubenswrapper[13984]: I0312 12:45:48.207373 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" event={"ID":"e90795eb-e2e0-44f9-8e28-cece07a4230e","Type":"ContainerDied","Data":"f7e33c7c00afd4252c842a3cf5b45d23b8311281def545680df537f481cbaea6"} Mar 12 12:45:48.207724 master-0 kubenswrapper[13984]: I0312 12:45:48.207706 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" event={"ID":"e90795eb-e2e0-44f9-8e28-cece07a4230e","Type":"ContainerStarted","Data":"01d9dc37a611215f8476111b5d1f9320393922c76695a7186b8722bb5537cc6e"} Mar 12 12:45:49.228925 master-0 kubenswrapper[13984]: I0312 12:45:49.228867 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" event={"ID":"e90795eb-e2e0-44f9-8e28-cece07a4230e","Type":"ContainerStarted","Data":"5e0ef48e5d179583437b13e96b72a8564e98578539e7336b091e6f744caea2aa"} Mar 12 12:45:49.229609 master-0 kubenswrapper[13984]: I0312 12:45:49.229587 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:45:49.266216 master-0 kubenswrapper[13984]: I0312 12:45:49.266119 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" podStartSLOduration=3.266093496 podStartE2EDuration="3.266093496s" podCreationTimestamp="2026-03-12 12:45:46 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:45:49.249764574 +0000 UTC m=+1281.447780066" watchObservedRunningTime="2026-03-12 12:45:49.266093496 +0000 UTC m=+1281.464108988" Mar 12 12:45:50.212267 master-0 kubenswrapper[13984]: I0312 12:45:50.212201 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-7dvg9" podUID="f5cd50ae-2194-4717-96f0-47b3d353c8b1" containerName="ovn-controller" probeResult="failure" output=< Mar 12 12:45:50.212267 master-0 kubenswrapper[13984]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 12 12:45:50.212267 master-0 kubenswrapper[13984]: > Mar 12 12:45:50.232784 master-0 kubenswrapper[13984]: I0312 12:45:50.232730 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vttwx" Mar 12 12:45:50.234716 master-0 kubenswrapper[13984]: I0312 12:45:50.233950 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-vttwx" Mar 12 12:45:51.261860 master-0 kubenswrapper[13984]: I0312 12:45:51.261801 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-7dvg9-config-zx44v"] Mar 12 12:45:51.263498 master-0 kubenswrapper[13984]: I0312 12:45:51.263452 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.266829 master-0 kubenswrapper[13984]: I0312 12:45:51.266606 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 12 12:45:51.270451 master-0 kubenswrapper[13984]: I0312 12:45:51.270405 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7dvg9-config-zx44v"] Mar 12 12:45:51.394213 master-0 kubenswrapper[13984]: I0312 12:45:51.393761 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-additional-scripts\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.394213 master-0 kubenswrapper[13984]: I0312 12:45:51.393935 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run-ovn\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.394213 master-0 kubenswrapper[13984]: I0312 12:45:51.393979 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-log-ovn\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.394213 master-0 kubenswrapper[13984]: I0312 12:45:51.394024 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-scripts\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.394213 master-0 kubenswrapper[13984]: I0312 12:45:51.394189 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlw8\" (UniqueName: \"kubernetes.io/projected/203e5cda-262b-4776-8012-99314b5df833-kube-api-access-twlw8\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.394544 master-0 kubenswrapper[13984]: I0312 12:45:51.394381 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.496611 master-0 kubenswrapper[13984]: I0312 12:45:51.496547 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-additional-scripts\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.496883 master-0 kubenswrapper[13984]: I0312 12:45:51.496692 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run-ovn\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.496883 master-0 kubenswrapper[13984]: I0312 12:45:51.496722 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-log-ovn\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.496883 master-0 kubenswrapper[13984]: I0312 12:45:51.496759 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-scripts\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.497028 master-0 kubenswrapper[13984]: I0312 12:45:51.496929 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run-ovn\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.497028 master-0 kubenswrapper[13984]: I0312 12:45:51.496961 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twlw8\" (UniqueName: \"kubernetes.io/projected/203e5cda-262b-4776-8012-99314b5df833-kube-api-access-twlw8\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.497170 master-0 kubenswrapper[13984]: I0312 12:45:51.497114 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.497319 master-0 
kubenswrapper[13984]: I0312 12:45:51.497288 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.497383 master-0 kubenswrapper[13984]: I0312 12:45:51.497344 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-log-ovn\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.498046 master-0 kubenswrapper[13984]: I0312 12:45:51.498013 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-additional-scripts\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.498975 master-0 kubenswrapper[13984]: I0312 12:45:51.498927 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-scripts\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.530659 master-0 kubenswrapper[13984]: I0312 12:45:51.530543 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlw8\" (UniqueName: \"kubernetes.io/projected/203e5cda-262b-4776-8012-99314b5df833-kube-api-access-twlw8\") pod \"ovn-controller-7dvg9-config-zx44v\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") " pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 
12:45:51.613389 master-0 kubenswrapper[13984]: I0312 12:45:51.609668 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7dvg9-config-zx44v" Mar 12 12:45:51.844387 master-0 kubenswrapper[13984]: I0312 12:45:51.844333 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 12 12:45:52.157146 master-0 kubenswrapper[13984]: I0312 12:45:52.144263 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-cpjvq"] Mar 12 12:45:52.157146 master-0 kubenswrapper[13984]: I0312 12:45:52.153942 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.157146 master-0 kubenswrapper[13984]: I0312 12:45:52.156527 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-7dvg9-config-zx44v"] Mar 12 12:45:52.169543 master-0 kubenswrapper[13984]: I0312 12:45:52.168674 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cpjvq"] Mar 12 12:45:52.279955 master-0 kubenswrapper[13984]: I0312 12:45:52.279901 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7dvg9-config-zx44v" event={"ID":"203e5cda-262b-4776-8012-99314b5df833","Type":"ContainerStarted","Data":"995e64b44d8587a96182a7654f5ebe595136a282c315a19aef6f273bf909938d"} Mar 12 12:45:52.301575 master-0 kubenswrapper[13984]: I0312 12:45:52.301502 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-74a2-account-create-update-xbbbk"] Mar 12 12:45:52.303900 master-0 kubenswrapper[13984]: I0312 12:45:52.303858 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.307423 master-0 kubenswrapper[13984]: I0312 12:45:52.307377 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 12 12:45:52.317318 master-0 kubenswrapper[13984]: I0312 12:45:52.317170 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-74a2-account-create-update-xbbbk"] Mar 12 12:45:52.333130 master-0 kubenswrapper[13984]: I0312 12:45:52.333087 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sbbq\" (UniqueName: \"kubernetes.io/projected/6567de78-8717-4214-8b7e-cd66e2ca2a24-kube-api-access-8sbbq\") pod \"cinder-db-create-cpjvq\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.333579 master-0 kubenswrapper[13984]: I0312 12:45:52.333554 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6567de78-8717-4214-8b7e-cd66e2ca2a24-operator-scripts\") pod \"cinder-db-create-cpjvq\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.436053 master-0 kubenswrapper[13984]: I0312 12:45:52.435791 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnq68\" (UniqueName: \"kubernetes.io/projected/6d81f389-a57f-4baa-9df7-21072a39b775-kube-api-access-cnq68\") pod \"cinder-74a2-account-create-update-xbbbk\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.436338 master-0 kubenswrapper[13984]: I0312 12:45:52.435855 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/6d81f389-a57f-4baa-9df7-21072a39b775-operator-scripts\") pod \"cinder-74a2-account-create-update-xbbbk\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.437156 master-0 kubenswrapper[13984]: I0312 12:45:52.436839 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6567de78-8717-4214-8b7e-cd66e2ca2a24-operator-scripts\") pod \"cinder-db-create-cpjvq\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.437809 master-0 kubenswrapper[13984]: I0312 12:45:52.437656 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6567de78-8717-4214-8b7e-cd66e2ca2a24-operator-scripts\") pod \"cinder-db-create-cpjvq\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.438612 master-0 kubenswrapper[13984]: I0312 12:45:52.438520 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sbbq\" (UniqueName: \"kubernetes.io/projected/6567de78-8717-4214-8b7e-cd66e2ca2a24-kube-api-access-8sbbq\") pod \"cinder-db-create-cpjvq\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.456556 master-0 kubenswrapper[13984]: I0312 12:45:52.456519 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sbbq\" (UniqueName: \"kubernetes.io/projected/6567de78-8717-4214-8b7e-cd66e2ca2a24-kube-api-access-8sbbq\") pod \"cinder-db-create-cpjvq\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.526510 master-0 kubenswrapper[13984]: I0312 12:45:52.521236 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-cpjvq" Mar 12 12:45:52.527595 master-0 kubenswrapper[13984]: I0312 12:45:52.527532 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-zfdwn"] Mar 12 12:45:52.547598 master-0 kubenswrapper[13984]: I0312 12:45:52.547466 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnq68\" (UniqueName: \"kubernetes.io/projected/6d81f389-a57f-4baa-9df7-21072a39b775-kube-api-access-cnq68\") pod \"cinder-74a2-account-create-update-xbbbk\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.559499 master-0 kubenswrapper[13984]: I0312 12:45:52.550524 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d81f389-a57f-4baa-9df7-21072a39b775-operator-scripts\") pod \"cinder-74a2-account-create-update-xbbbk\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.559499 master-0 kubenswrapper[13984]: I0312 12:45:52.551684 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d81f389-a57f-4baa-9df7-21072a39b775-operator-scripts\") pod \"cinder-74a2-account-create-update-xbbbk\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.562513 master-0 kubenswrapper[13984]: I0312 12:45:52.562073 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zfdwn" Mar 12 12:45:52.580548 master-0 kubenswrapper[13984]: I0312 12:45:52.580489 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnq68\" (UniqueName: \"kubernetes.io/projected/6d81f389-a57f-4baa-9df7-21072a39b775-kube-api-access-cnq68\") pod \"cinder-74a2-account-create-update-xbbbk\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.585640 master-0 kubenswrapper[13984]: I0312 12:45:52.585594 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zfdwn"] Mar 12 12:45:52.639565 master-0 kubenswrapper[13984]: I0312 12:45:52.638052 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-16d8-account-create-update-lp6v6"] Mar 12 12:45:52.639565 master-0 kubenswrapper[13984]: I0312 12:45:52.639444 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-16d8-account-create-update-lp6v6" Mar 12 12:45:52.641515 master-0 kubenswrapper[13984]: I0312 12:45:52.641118 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 12 12:45:52.654259 master-0 kubenswrapper[13984]: I0312 12:45:52.654187 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb892f56-bc0f-4500-903c-49c41d16945b-operator-scripts\") pod \"neutron-db-create-zfdwn\" (UID: \"eb892f56-bc0f-4500-903c-49c41d16945b\") " pod="openstack/neutron-db-create-zfdwn" Mar 12 12:45:52.654523 master-0 kubenswrapper[13984]: I0312 12:45:52.654282 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8gh\" (UniqueName: \"kubernetes.io/projected/eb892f56-bc0f-4500-903c-49c41d16945b-kube-api-access-nf8gh\") pod \"neutron-db-create-zfdwn\" (UID: 
\"eb892f56-bc0f-4500-903c-49c41d16945b\") " pod="openstack/neutron-db-create-zfdwn" Mar 12 12:45:52.664419 master-0 kubenswrapper[13984]: I0312 12:45:52.664004 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6rxfx"] Mar 12 12:45:52.667544 master-0 kubenswrapper[13984]: I0312 12:45:52.667055 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6rxfx" Mar 12 12:45:52.684225 master-0 kubenswrapper[13984]: I0312 12:45:52.683494 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:45:52.688870 master-0 kubenswrapper[13984]: I0312 12:45:52.685533 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 12:45:52.688870 master-0 kubenswrapper[13984]: I0312 12:45:52.685658 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 12:45:52.688870 master-0 kubenswrapper[13984]: I0312 12:45:52.685769 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 12:45:52.711191 master-0 kubenswrapper[13984]: I0312 12:45:52.709157 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-16d8-account-create-update-lp6v6"] Mar 12 12:45:52.734708 master-0 kubenswrapper[13984]: I0312 12:45:52.734584 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6rxfx"] Mar 12 12:45:52.757028 master-0 kubenswrapper[13984]: I0312 12:45:52.756870 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb892f56-bc0f-4500-903c-49c41d16945b-operator-scripts\") pod \"neutron-db-create-zfdwn\" (UID: \"eb892f56-bc0f-4500-903c-49c41d16945b\") " pod="openstack/neutron-db-create-zfdwn" Mar 12 12:45:52.757028 master-0 kubenswrapper[13984]: I0312 
12:45:52.756927 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8gh\" (UniqueName: \"kubernetes.io/projected/eb892f56-bc0f-4500-903c-49c41d16945b-kube-api-access-nf8gh\") pod \"neutron-db-create-zfdwn\" (UID: \"eb892f56-bc0f-4500-903c-49c41d16945b\") " pod="openstack/neutron-db-create-zfdwn"
Mar 12 12:45:52.757028 master-0 kubenswrapper[13984]: I0312 12:45:52.756958 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrg5w\" (UniqueName: \"kubernetes.io/projected/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-kube-api-access-rrg5w\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.757028 master-0 kubenswrapper[13984]: I0312 12:45:52.756980 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2eefae0-724a-451b-b39b-753fa2551ba1-operator-scripts\") pod \"neutron-16d8-account-create-update-lp6v6\" (UID: \"f2eefae0-724a-451b-b39b-753fa2551ba1\") " pod="openstack/neutron-16d8-account-create-update-lp6v6"
Mar 12 12:45:52.757028 master-0 kubenswrapper[13984]: I0312 12:45:52.757008 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjmdx\" (UniqueName: \"kubernetes.io/projected/f2eefae0-724a-451b-b39b-753fa2551ba1-kube-api-access-bjmdx\") pod \"neutron-16d8-account-create-update-lp6v6\" (UID: \"f2eefae0-724a-451b-b39b-753fa2551ba1\") " pod="openstack/neutron-16d8-account-create-update-lp6v6"
Mar 12 12:45:52.757028 master-0 kubenswrapper[13984]: I0312 12:45:52.757047 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-combined-ca-bundle\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.757398 master-0 kubenswrapper[13984]: I0312 12:45:52.757085 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-config-data\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.759211 master-0 kubenswrapper[13984]: I0312 12:45:52.757962 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb892f56-bc0f-4500-903c-49c41d16945b-operator-scripts\") pod \"neutron-db-create-zfdwn\" (UID: \"eb892f56-bc0f-4500-903c-49c41d16945b\") " pod="openstack/neutron-db-create-zfdwn"
Mar 12 12:45:52.779393 master-0 kubenswrapper[13984]: I0312 12:45:52.777515 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8gh\" (UniqueName: \"kubernetes.io/projected/eb892f56-bc0f-4500-903c-49c41d16945b-kube-api-access-nf8gh\") pod \"neutron-db-create-zfdwn\" (UID: \"eb892f56-bc0f-4500-903c-49c41d16945b\") " pod="openstack/neutron-db-create-zfdwn"
Mar 12 12:45:52.806333 master-0 kubenswrapper[13984]: I0312 12:45:52.804337 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zfdwn"
Mar 12 12:45:52.859551 master-0 kubenswrapper[13984]: I0312 12:45:52.859439 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-config-data\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.859888 master-0 kubenswrapper[13984]: I0312 12:45:52.859852 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrg5w\" (UniqueName: \"kubernetes.io/projected/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-kube-api-access-rrg5w\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.859978 master-0 kubenswrapper[13984]: I0312 12:45:52.859903 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2eefae0-724a-451b-b39b-753fa2551ba1-operator-scripts\") pod \"neutron-16d8-account-create-update-lp6v6\" (UID: \"f2eefae0-724a-451b-b39b-753fa2551ba1\") " pod="openstack/neutron-16d8-account-create-update-lp6v6"
Mar 12 12:45:52.859978 master-0 kubenswrapper[13984]: I0312 12:45:52.859944 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjmdx\" (UniqueName: \"kubernetes.io/projected/f2eefae0-724a-451b-b39b-753fa2551ba1-kube-api-access-bjmdx\") pod \"neutron-16d8-account-create-update-lp6v6\" (UID: \"f2eefae0-724a-451b-b39b-753fa2551ba1\") " pod="openstack/neutron-16d8-account-create-update-lp6v6"
Mar 12 12:45:52.860089 master-0 kubenswrapper[13984]: I0312 12:45:52.860007 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-combined-ca-bundle\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.864580 master-0 kubenswrapper[13984]: I0312 12:45:52.862029 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2eefae0-724a-451b-b39b-753fa2551ba1-operator-scripts\") pod \"neutron-16d8-account-create-update-lp6v6\" (UID: \"f2eefae0-724a-451b-b39b-753fa2551ba1\") " pod="openstack/neutron-16d8-account-create-update-lp6v6"
Mar 12 12:45:52.868407 master-0 kubenswrapper[13984]: I0312 12:45:52.867826 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-combined-ca-bundle\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.869432 master-0 kubenswrapper[13984]: I0312 12:45:52.869361 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-config-data\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.899332 master-0 kubenswrapper[13984]: I0312 12:45:52.899253 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrg5w\" (UniqueName: \"kubernetes.io/projected/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-kube-api-access-rrg5w\") pod \"keystone-db-sync-6rxfx\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:52.899548 master-0 kubenswrapper[13984]: I0312 12:45:52.899253 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjmdx\" (UniqueName: \"kubernetes.io/projected/f2eefae0-724a-451b-b39b-753fa2551ba1-kube-api-access-bjmdx\") pod \"neutron-16d8-account-create-update-lp6v6\" (UID: \"f2eefae0-724a-451b-b39b-753fa2551ba1\") " pod="openstack/neutron-16d8-account-create-update-lp6v6"
Mar 12 12:45:53.126588 master-0 kubenswrapper[13984]: I0312 12:45:53.126519 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-16d8-account-create-update-lp6v6"
Mar 12 12:45:53.178690 master-0 kubenswrapper[13984]: I0312 12:45:53.177094 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6rxfx"
Mar 12 12:45:53.222551 master-0 kubenswrapper[13984]: I0312 12:45:53.220017 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-cpjvq"]
Mar 12 12:45:53.290787 master-0 kubenswrapper[13984]: I0312 12:45:53.290728 13984 generic.go:334] "Generic (PLEG): container finished" podID="203e5cda-262b-4776-8012-99314b5df833" containerID="7b13f30cb13c59942bbe65a5bd4ac18816f0e5964d563ec4664fe36103c129af" exitCode=0
Mar 12 12:45:53.290787 master-0 kubenswrapper[13984]: I0312 12:45:53.290772 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7dvg9-config-zx44v" event={"ID":"203e5cda-262b-4776-8012-99314b5df833","Type":"ContainerDied","Data":"7b13f30cb13c59942bbe65a5bd4ac18816f0e5964d563ec4664fe36103c129af"}
Mar 12 12:45:53.381676 master-0 kubenswrapper[13984]: I0312 12:45:53.381099 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-74a2-account-create-update-xbbbk"]
Mar 12 12:45:53.495718 master-0 kubenswrapper[13984]: I0312 12:45:53.495654 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-zfdwn"]
Mar 12 12:45:53.650817 master-0 kubenswrapper[13984]: I0312 12:45:53.650767 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-16d8-account-create-update-lp6v6"]
Mar 12 12:45:55.186056 master-0 kubenswrapper[13984]: I0312 12:45:55.185997 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-7dvg9"
Mar 12 12:45:56.880264 master-0 kubenswrapper[13984]: I0312 12:45:56.880175 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv"
Mar 12 12:45:56.981430 master-0 kubenswrapper[13984]: I0312 12:45:56.980090 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-hbdd6"]
Mar 12 12:45:56.981430 master-0 kubenswrapper[13984]: I0312 12:45:56.980589 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" podUID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerName="dnsmasq-dns" containerID="cri-o://a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92" gracePeriod=10
Mar 12 12:46:00.201772 master-0 kubenswrapper[13984]: I0312 12:46:00.201697 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 12 12:46:00.416714 master-0 kubenswrapper[13984]: W0312 12:46:00.416665 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2eefae0_724a_451b_b39b_753fa2551ba1.slice/crio-215628fc1eb3d3ad647c0de892fe369ca6bc6761f7a9f18884bd53a0060a614e WatchSource:0}: Error finding container 215628fc1eb3d3ad647c0de892fe369ca6bc6761f7a9f18884bd53a0060a614e: Status 404 returned error can't find the container with id 215628fc1eb3d3ad647c0de892fe369ca6bc6761f7a9f18884bd53a0060a614e
Mar 12 12:46:00.585814 master-0 kubenswrapper[13984]: I0312 12:46:00.585761 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7dvg9-config-zx44v"
Mar 12 12:46:00.763217 master-0 kubenswrapper[13984]: I0312 12:46:00.763114 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run-ovn\") pod \"203e5cda-262b-4776-8012-99314b5df833\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") "
Mar 12 12:46:00.767317 master-0 kubenswrapper[13984]: I0312 12:46:00.763289 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-log-ovn\") pod \"203e5cda-262b-4776-8012-99314b5df833\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") "
Mar 12 12:46:00.767317 master-0 kubenswrapper[13984]: I0312 12:46:00.763385 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-additional-scripts\") pod \"203e5cda-262b-4776-8012-99314b5df833\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") "
Mar 12 12:46:00.767317 master-0 kubenswrapper[13984]: I0312 12:46:00.763718 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run\") pod \"203e5cda-262b-4776-8012-99314b5df833\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") "
Mar 12 12:46:00.767317 master-0 kubenswrapper[13984]: I0312 12:46:00.763778 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-scripts\") pod \"203e5cda-262b-4776-8012-99314b5df833\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") "
Mar 12 12:46:00.767317 master-0 kubenswrapper[13984]: I0312 12:46:00.763898 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twlw8\" (UniqueName: \"kubernetes.io/projected/203e5cda-262b-4776-8012-99314b5df833-kube-api-access-twlw8\") pod \"203e5cda-262b-4776-8012-99314b5df833\" (UID: \"203e5cda-262b-4776-8012-99314b5df833\") "
Mar 12 12:46:00.767317 master-0 kubenswrapper[13984]: I0312 12:46:00.763265 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "203e5cda-262b-4776-8012-99314b5df833" (UID: "203e5cda-262b-4776-8012-99314b5df833"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:46:00.767317 master-0 kubenswrapper[13984]: I0312 12:46:00.765965 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "203e5cda-262b-4776-8012-99314b5df833" (UID: "203e5cda-262b-4776-8012-99314b5df833"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:46:00.768513 master-0 kubenswrapper[13984]: I0312 12:46:00.768467 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "203e5cda-262b-4776-8012-99314b5df833" (UID: "203e5cda-262b-4776-8012-99314b5df833"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:00.771027 master-0 kubenswrapper[13984]: I0312 12:46:00.770881 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-scripts" (OuterVolumeSpecName: "scripts") pod "203e5cda-262b-4776-8012-99314b5df833" (UID: "203e5cda-262b-4776-8012-99314b5df833"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:00.771027 master-0 kubenswrapper[13984]: I0312 12:46:00.770987 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run" (OuterVolumeSpecName: "var-run") pod "203e5cda-262b-4776-8012-99314b5df833" (UID: "203e5cda-262b-4776-8012-99314b5df833"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:46:00.801144 master-0 kubenswrapper[13984]: I0312 12:46:00.800873 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/203e5cda-262b-4776-8012-99314b5df833-kube-api-access-twlw8" (OuterVolumeSpecName: "kube-api-access-twlw8") pod "203e5cda-262b-4776-8012-99314b5df833" (UID: "203e5cda-262b-4776-8012-99314b5df833"). InnerVolumeSpecName "kube-api-access-twlw8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:46:00.866657 master-0 kubenswrapper[13984]: I0312 12:46:00.866596 13984 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:00.866657 master-0 kubenswrapper[13984]: I0312 12:46:00.866636 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:00.866657 master-0 kubenswrapper[13984]: I0312 12:46:00.866650 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twlw8\" (UniqueName: \"kubernetes.io/projected/203e5cda-262b-4776-8012-99314b5df833-kube-api-access-twlw8\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:00.866657 master-0 kubenswrapper[13984]: I0312 12:46:00.866663 13984 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-log-ovn\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:00.866657 master-0 kubenswrapper[13984]: I0312 12:46:00.866672 13984 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/203e5cda-262b-4776-8012-99314b5df833-var-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:00.866657 master-0 kubenswrapper[13984]: I0312 12:46:00.866681 13984 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/203e5cda-262b-4776-8012-99314b5df833-additional-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:00.968372 master-0 kubenswrapper[13984]: I0312 12:46:00.968318 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6"
Mar 12 12:46:01.076970 master-0 kubenswrapper[13984]: I0312 12:46:01.073062 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmf7j\" (UniqueName: \"kubernetes.io/projected/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-kube-api-access-tmf7j\") pod \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") "
Mar 12 12:46:01.076970 master-0 kubenswrapper[13984]: I0312 12:46:01.073133 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-dns-svc\") pod \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") "
Mar 12 12:46:01.076970 master-0 kubenswrapper[13984]: I0312 12:46:01.073199 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-sb\") pod \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") "
Mar 12 12:46:01.076970 master-0 kubenswrapper[13984]: I0312 12:46:01.073235 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-config\") pod \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") "
Mar 12 12:46:01.076970 master-0 kubenswrapper[13984]: I0312 12:46:01.073324 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-nb\") pod \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\" (UID: \"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c\") "
Mar 12 12:46:01.097501 master-0 kubenswrapper[13984]: I0312 12:46:01.090858 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-kube-api-access-tmf7j" (OuterVolumeSpecName: "kube-api-access-tmf7j") pod "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" (UID: "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c"). InnerVolumeSpecName "kube-api-access-tmf7j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:46:01.112504 master-0 kubenswrapper[13984]: I0312 12:46:01.105075 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6rxfx"]
Mar 12 12:46:01.185502 master-0 kubenswrapper[13984]: I0312 12:46:01.175752 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmf7j\" (UniqueName: \"kubernetes.io/projected/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-kube-api-access-tmf7j\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:01.215498 master-0 kubenswrapper[13984]: I0312 12:46:01.214899 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" (UID: "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:01.257893 master-0 kubenswrapper[13984]: I0312 12:46:01.257759 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-config" (OuterVolumeSpecName: "config") pod "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" (UID: "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:01.269176 master-0 kubenswrapper[13984]: I0312 12:46:01.261372 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" (UID: "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:01.269176 master-0 kubenswrapper[13984]: I0312 12:46:01.266240 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" (UID: "f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:01.285498 master-0 kubenswrapper[13984]: I0312 12:46:01.281774 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:01.285498 master-0 kubenswrapper[13984]: I0312 12:46:01.281841 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:01.285498 master-0 kubenswrapper[13984]: I0312 12:46:01.281854 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:01.285498 master-0 kubenswrapper[13984]: I0312 12:46:01.281864 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:01.392801 master-0 kubenswrapper[13984]: I0312 12:46:01.391739 13984 generic.go:334] "Generic (PLEG): container finished" podID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerID="a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92" exitCode=0
Mar 12 12:46:01.392801 master-0 kubenswrapper[13984]: I0312 12:46:01.391809 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" event={"ID":"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c","Type":"ContainerDied","Data":"a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92"}
Mar 12 12:46:01.392801 master-0 kubenswrapper[13984]: I0312 12:46:01.391871 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6"
Mar 12 12:46:01.392801 master-0 kubenswrapper[13984]: I0312 12:46:01.391933 13984 scope.go:117] "RemoveContainer" containerID="a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92"
Mar 12 12:46:01.392801 master-0 kubenswrapper[13984]: I0312 12:46:01.391899 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf8b865dc-hbdd6" event={"ID":"f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c","Type":"ContainerDied","Data":"f59f6bada1979cce50108ca95b76ad0f1aa4160d4e8bbee0c6f6ef94ff79e50b"}
Mar 12 12:46:01.410965 master-0 kubenswrapper[13984]: I0312 12:46:01.410909 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-7dvg9-config-zx44v" event={"ID":"203e5cda-262b-4776-8012-99314b5df833","Type":"ContainerDied","Data":"995e64b44d8587a96182a7654f5ebe595136a282c315a19aef6f273bf909938d"}
Mar 12 12:46:01.410965 master-0 kubenswrapper[13984]: I0312 12:46:01.410965 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="995e64b44d8587a96182a7654f5ebe595136a282c315a19aef6f273bf909938d"
Mar 12 12:46:01.411146 master-0 kubenswrapper[13984]: I0312 12:46:01.411052 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-7dvg9-config-zx44v"
Mar 12 12:46:01.438857 master-0 kubenswrapper[13984]: I0312 12:46:01.436797 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-16d8-account-create-update-lp6v6" event={"ID":"f2eefae0-724a-451b-b39b-753fa2551ba1","Type":"ContainerStarted","Data":"5ee0d474666ed7d476805f64bb18b2614147f7b3689af5a58eadab957f8be8a9"}
Mar 12 12:46:01.438857 master-0 kubenswrapper[13984]: I0312 12:46:01.436856 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-16d8-account-create-update-lp6v6" event={"ID":"f2eefae0-724a-451b-b39b-753fa2551ba1","Type":"ContainerStarted","Data":"215628fc1eb3d3ad647c0de892fe369ca6bc6761f7a9f18884bd53a0060a614e"}
Mar 12 12:46:01.441504 master-0 kubenswrapper[13984]: I0312 12:46:01.440046 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cpjvq" event={"ID":"6567de78-8717-4214-8b7e-cd66e2ca2a24","Type":"ContainerStarted","Data":"374390468a0e8fd71fc21611c25faefd9e0e97b2a32de3d0dc1dbf75d597581e"}
Mar 12 12:46:01.441504 master-0 kubenswrapper[13984]: I0312 12:46:01.440160 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cpjvq" event={"ID":"6567de78-8717-4214-8b7e-cd66e2ca2a24","Type":"ContainerStarted","Data":"cc467e7db2fba9ad7c76d4e1f8ff4569fdc2a21fdeb5ae9efeb1d9c2cc8089d2"}
Mar 12 12:46:01.449348 master-0 kubenswrapper[13984]: I0312 12:46:01.449044 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6rxfx" event={"ID":"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6","Type":"ContainerStarted","Data":"a899852a0cfc4329c3c96b0fde8062051e56168fda5fb8e13c692a095c6d5c8c"}
Mar 12 12:46:01.453375 master-0 kubenswrapper[13984]: I0312 12:46:01.453323 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-74a2-account-create-update-xbbbk" event={"ID":"6d81f389-a57f-4baa-9df7-21072a39b775","Type":"ContainerStarted","Data":"7bb0efba5a778494cd5aebe304bbc3308f0cf15eb01166d3bc7e5d4db07374f5"}
Mar 12 12:46:01.453498 master-0 kubenswrapper[13984]: I0312 12:46:01.453469 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-74a2-account-create-update-xbbbk" event={"ID":"6d81f389-a57f-4baa-9df7-21072a39b775","Type":"ContainerStarted","Data":"1b8ca91ad43de5f49774975d7fbd1fdc01c1a7412c11092d39bf1b3718c56362"}
Mar 12 12:46:01.481447 master-0 kubenswrapper[13984]: I0312 12:46:01.460225 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zfdwn" event={"ID":"eb892f56-bc0f-4500-903c-49c41d16945b","Type":"ContainerStarted","Data":"f9b9fe876e795dd1e21c83d1b1cecfb0938ee00a2d357e5a917ce8777963f28a"}
Mar 12 12:46:01.481447 master-0 kubenswrapper[13984]: I0312 12:46:01.460317 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zfdwn" event={"ID":"eb892f56-bc0f-4500-903c-49c41d16945b","Type":"ContainerStarted","Data":"9f876f490609103d90919ef2d40aee554c1912582ad0e84eaae81e89345470da"}
Mar 12 12:46:01.499981 master-0 kubenswrapper[13984]: I0312 12:46:01.497850 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-16d8-account-create-update-lp6v6" podStartSLOduration=9.497821161 podStartE2EDuration="9.497821161s" podCreationTimestamp="2026-03-12 12:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:01.46274507 +0000 UTC m=+1293.660760562" watchObservedRunningTime="2026-03-12 12:46:01.497821161 +0000 UTC m=+1293.695836653"
Mar 12 12:46:01.529004 master-0 kubenswrapper[13984]: I0312 12:46:01.522282 13984 scope.go:117] "RemoveContainer" containerID="48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f"
Mar 12 12:46:01.532988 master-0 kubenswrapper[13984]: I0312 12:46:01.532519 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-74a2-account-create-update-xbbbk" podStartSLOduration=9.532496703 podStartE2EDuration="9.532496703s" podCreationTimestamp="2026-03-12 12:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:01.490292515 +0000 UTC m=+1293.688308007" watchObservedRunningTime="2026-03-12 12:46:01.532496703 +0000 UTC m=+1293.730512205"
Mar 12 12:46:01.571324 master-0 kubenswrapper[13984]: I0312 12:46:01.571282 13984 scope.go:117] "RemoveContainer" containerID="a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92"
Mar 12 12:46:01.577505 master-0 kubenswrapper[13984]: E0312 12:46:01.573168 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92\": container with ID starting with a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92 not found: ID does not exist" containerID="a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92"
Mar 12 12:46:01.577505 master-0 kubenswrapper[13984]: I0312 12:46:01.573220 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92"} err="failed to get container status \"a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92\": rpc error: code = NotFound desc = could not find container \"a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92\": container with ID starting with a9fe0bf894c3fc71d71a144b1d77601e6224951672bcd7c65c478902e2d90f92 not found: ID does not exist"
Mar 12 12:46:01.577505 master-0 kubenswrapper[13984]: I0312 12:46:01.573246 13984 scope.go:117] "RemoveContainer" containerID="48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f"
Mar 12 12:46:01.577505 master-0 kubenswrapper[13984]: E0312 12:46:01.574209 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f\": container with ID starting with 48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f not found: ID does not exist" containerID="48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f"
Mar 12 12:46:01.577505 master-0 kubenswrapper[13984]: I0312 12:46:01.574285 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f"} err="failed to get container status \"48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f\": rpc error: code = NotFound desc = could not find container \"48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f\": container with ID starting with 48c9b6d98db3a0a59f253049a049b5da27502d14780a1f4e861829ea8951fa5f not found: ID does not exist"
Mar 12 12:46:01.583278 master-0 kubenswrapper[13984]: I0312 12:46:01.582448 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-create-zfdwn" podStartSLOduration=9.582418202 podStartE2EDuration="9.582418202s" podCreationTimestamp="2026-03-12 12:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:01.522789056 +0000 UTC m=+1293.720804558" watchObservedRunningTime="2026-03-12 12:46:01.582418202 +0000 UTC m=+1293.780433714"
Mar 12 12:46:01.593071 master-0 kubenswrapper[13984]: I0312 12:46:01.592129 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-cpjvq" podStartSLOduration=9.592106459 podStartE2EDuration="9.592106459s" podCreationTimestamp="2026-03-12 12:45:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:01.570841621 +0000 UTC m=+1293.768857123" watchObservedRunningTime="2026-03-12 12:46:01.592106459 +0000 UTC m=+1293.790121951"
Mar 12 12:46:01.616257 master-0 kubenswrapper[13984]: I0312 12:46:01.616190 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-hbdd6"]
Mar 12 12:46:01.662220 master-0 kubenswrapper[13984]: I0312 12:46:01.661763 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf8b865dc-hbdd6"]
Mar 12 12:46:01.743943 master-0 kubenswrapper[13984]: I0312 12:46:01.743864 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-7dvg9-config-zx44v"]
Mar 12 12:46:01.764808 master-0 kubenswrapper[13984]: I0312 12:46:01.764731 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-7dvg9-config-zx44v"]
Mar 12 12:46:01.996290 master-0 kubenswrapper[13984]: I0312 12:46:01.996189 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="203e5cda-262b-4776-8012-99314b5df833" path="/var/lib/kubelet/pods/203e5cda-262b-4776-8012-99314b5df833/volumes"
Mar 12 12:46:01.997399 master-0 kubenswrapper[13984]: I0312 12:46:01.997378 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" path="/var/lib/kubelet/pods/f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c/volumes"
Mar 12 12:46:02.482048 master-0 kubenswrapper[13984]: I0312 12:46:02.481997 13984 generic.go:334] "Generic (PLEG): container finished" podID="6d81f389-a57f-4baa-9df7-21072a39b775" containerID="7bb0efba5a778494cd5aebe304bbc3308f0cf15eb01166d3bc7e5d4db07374f5" exitCode=0
Mar 12 12:46:02.482627 master-0 kubenswrapper[13984]: I0312 12:46:02.482060 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-74a2-account-create-update-xbbbk" event={"ID":"6d81f389-a57f-4baa-9df7-21072a39b775","Type":"ContainerDied","Data":"7bb0efba5a778494cd5aebe304bbc3308f0cf15eb01166d3bc7e5d4db07374f5"}
Mar 12 12:46:02.484607 master-0 kubenswrapper[13984]: I0312 12:46:02.484443 13984 generic.go:334] "Generic (PLEG): container finished" podID="eb892f56-bc0f-4500-903c-49c41d16945b" containerID="f9b9fe876e795dd1e21c83d1b1cecfb0938ee00a2d357e5a917ce8777963f28a" exitCode=0
Mar 12 12:46:02.484607 master-0 kubenswrapper[13984]: I0312 12:46:02.484529 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zfdwn" event={"ID":"eb892f56-bc0f-4500-903c-49c41d16945b","Type":"ContainerDied","Data":"f9b9fe876e795dd1e21c83d1b1cecfb0938ee00a2d357e5a917ce8777963f28a"}
Mar 12 12:46:02.489940 master-0 kubenswrapper[13984]: I0312 12:46:02.489836 13984 generic.go:334] "Generic (PLEG): container finished" podID="f2eefae0-724a-451b-b39b-753fa2551ba1" containerID="5ee0d474666ed7d476805f64bb18b2614147f7b3689af5a58eadab957f8be8a9" exitCode=0
Mar 12 12:46:02.490111 master-0 kubenswrapper[13984]: I0312 12:46:02.489989 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-16d8-account-create-update-lp6v6" event={"ID":"f2eefae0-724a-451b-b39b-753fa2551ba1","Type":"ContainerDied","Data":"5ee0d474666ed7d476805f64bb18b2614147f7b3689af5a58eadab957f8be8a9"}
Mar 12 12:46:02.492681 master-0 kubenswrapper[13984]: I0312 12:46:02.492647 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zt2nr" event={"ID":"f56d283a-8eb6-4927-bf3c-3145cc96ee28","Type":"ContainerStarted","Data":"3f711da18448676cc41570fef4760423396bf0252c1cbadadd8cd053cbf0da4e"}
Mar 12 12:46:02.499606 master-0 kubenswrapper[13984]: I0312 12:46:02.497169 13984 generic.go:334] "Generic (PLEG): container finished" podID="6567de78-8717-4214-8b7e-cd66e2ca2a24" containerID="374390468a0e8fd71fc21611c25faefd9e0e97b2a32de3d0dc1dbf75d597581e" exitCode=0
Mar
12 12:46:02.499606 master-0 kubenswrapper[13984]: I0312 12:46:02.497948 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cpjvq" event={"ID":"6567de78-8717-4214-8b7e-cd66e2ca2a24","Type":"ContainerDied","Data":"374390468a0e8fd71fc21611c25faefd9e0e97b2a32de3d0dc1dbf75d597581e"} Mar 12 12:46:02.555911 master-0 kubenswrapper[13984]: I0312 12:46:02.555800 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-zt2nr" podStartSLOduration=3.754603402 podStartE2EDuration="19.555775287s" podCreationTimestamp="2026-03-12 12:45:43 +0000 UTC" firstStartedPulling="2026-03-12 12:45:44.827167723 +0000 UTC m=+1277.025183215" lastFinishedPulling="2026-03-12 12:46:00.628339608 +0000 UTC m=+1292.826355100" observedRunningTime="2026-03-12 12:46:02.545105837 +0000 UTC m=+1294.743121349" watchObservedRunningTime="2026-03-12 12:46:02.555775287 +0000 UTC m=+1294.753790789" Mar 12 12:46:06.525596 master-0 kubenswrapper[13984]: I0312 12:46:06.525538 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-zfdwn" Mar 12 12:46:06.535111 master-0 kubenswrapper[13984]: I0312 12:46:06.535060 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:46:06.539147 master-0 kubenswrapper[13984]: I0312 12:46:06.539070 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf8gh\" (UniqueName: \"kubernetes.io/projected/eb892f56-bc0f-4500-903c-49c41d16945b-kube-api-access-nf8gh\") pod \"eb892f56-bc0f-4500-903c-49c41d16945b\" (UID: \"eb892f56-bc0f-4500-903c-49c41d16945b\") " Mar 12 12:46:06.539545 master-0 kubenswrapper[13984]: I0312 12:46:06.539470 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb892f56-bc0f-4500-903c-49c41d16945b-operator-scripts\") pod \"eb892f56-bc0f-4500-903c-49c41d16945b\" (UID: \"eb892f56-bc0f-4500-903c-49c41d16945b\") " Mar 12 12:46:06.540168 master-0 kubenswrapper[13984]: I0312 12:46:06.540130 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb892f56-bc0f-4500-903c-49c41d16945b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "eb892f56-bc0f-4500-903c-49c41d16945b" (UID: "eb892f56-bc0f-4500-903c-49c41d16945b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:06.542965 master-0 kubenswrapper[13984]: I0312 12:46:06.542906 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-16d8-account-create-update-lp6v6" Mar 12 12:46:06.543148 master-0 kubenswrapper[13984]: I0312 12:46:06.542986 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb892f56-bc0f-4500-903c-49c41d16945b-kube-api-access-nf8gh" (OuterVolumeSpecName: "kube-api-access-nf8gh") pod "eb892f56-bc0f-4500-903c-49c41d16945b" (UID: "eb892f56-bc0f-4500-903c-49c41d16945b"). InnerVolumeSpecName "kube-api-access-nf8gh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:06.592738 master-0 kubenswrapper[13984]: I0312 12:46:06.592701 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cpjvq" Mar 12 12:46:06.610455 master-0 kubenswrapper[13984]: I0312 12:46:06.610390 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-16d8-account-create-update-lp6v6" event={"ID":"f2eefae0-724a-451b-b39b-753fa2551ba1","Type":"ContainerDied","Data":"215628fc1eb3d3ad647c0de892fe369ca6bc6761f7a9f18884bd53a0060a614e"} Mar 12 12:46:06.610455 master-0 kubenswrapper[13984]: I0312 12:46:06.610437 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="215628fc1eb3d3ad647c0de892fe369ca6bc6761f7a9f18884bd53a0060a614e" Mar 12 12:46:06.610455 master-0 kubenswrapper[13984]: I0312 12:46:06.610442 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-16d8-account-create-update-lp6v6" Mar 12 12:46:06.612456 master-0 kubenswrapper[13984]: I0312 12:46:06.612377 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-cpjvq" event={"ID":"6567de78-8717-4214-8b7e-cd66e2ca2a24","Type":"ContainerDied","Data":"cc467e7db2fba9ad7c76d4e1f8ff4569fdc2a21fdeb5ae9efeb1d9c2cc8089d2"} Mar 12 12:46:06.612554 master-0 kubenswrapper[13984]: I0312 12:46:06.612470 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc467e7db2fba9ad7c76d4e1f8ff4569fdc2a21fdeb5ae9efeb1d9c2cc8089d2" Mar 12 12:46:06.612554 master-0 kubenswrapper[13984]: I0312 12:46:06.612426 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-cpjvq" Mar 12 12:46:06.614212 master-0 kubenswrapper[13984]: I0312 12:46:06.614177 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-74a2-account-create-update-xbbbk" Mar 12 12:46:06.614212 master-0 kubenswrapper[13984]: I0312 12:46:06.614189 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-74a2-account-create-update-xbbbk" event={"ID":"6d81f389-a57f-4baa-9df7-21072a39b775","Type":"ContainerDied","Data":"1b8ca91ad43de5f49774975d7fbd1fdc01c1a7412c11092d39bf1b3718c56362"} Mar 12 12:46:06.614387 master-0 kubenswrapper[13984]: I0312 12:46:06.614225 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b8ca91ad43de5f49774975d7fbd1fdc01c1a7412c11092d39bf1b3718c56362" Mar 12 12:46:06.618457 master-0 kubenswrapper[13984]: I0312 12:46:06.615762 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-zfdwn" event={"ID":"eb892f56-bc0f-4500-903c-49c41d16945b","Type":"ContainerDied","Data":"9f876f490609103d90919ef2d40aee554c1912582ad0e84eaae81e89345470da"} Mar 12 12:46:06.618457 master-0 kubenswrapper[13984]: I0312 12:46:06.615794 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f876f490609103d90919ef2d40aee554c1912582ad0e84eaae81e89345470da" Mar 12 12:46:06.618457 master-0 kubenswrapper[13984]: I0312 12:46:06.615816 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-zfdwn" Mar 12 12:46:06.646984 master-0 kubenswrapper[13984]: I0312 12:46:06.643219 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2eefae0-724a-451b-b39b-753fa2551ba1-operator-scripts\") pod \"f2eefae0-724a-451b-b39b-753fa2551ba1\" (UID: \"f2eefae0-724a-451b-b39b-753fa2551ba1\") " Mar 12 12:46:06.646984 master-0 kubenswrapper[13984]: I0312 12:46:06.645093 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2eefae0-724a-451b-b39b-753fa2551ba1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2eefae0-724a-451b-b39b-753fa2551ba1" (UID: "f2eefae0-724a-451b-b39b-753fa2551ba1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:06.669869 master-0 kubenswrapper[13984]: I0312 12:46:06.669715 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d81f389-a57f-4baa-9df7-21072a39b775-operator-scripts\") pod \"6d81f389-a57f-4baa-9df7-21072a39b775\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " Mar 12 12:46:06.669869 master-0 kubenswrapper[13984]: I0312 12:46:06.669857 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sbbq\" (UniqueName: \"kubernetes.io/projected/6567de78-8717-4214-8b7e-cd66e2ca2a24-kube-api-access-8sbbq\") pod \"6567de78-8717-4214-8b7e-cd66e2ca2a24\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " Mar 12 12:46:06.670130 master-0 kubenswrapper[13984]: I0312 12:46:06.669881 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjmdx\" (UniqueName: \"kubernetes.io/projected/f2eefae0-724a-451b-b39b-753fa2551ba1-kube-api-access-bjmdx\") pod \"f2eefae0-724a-451b-b39b-753fa2551ba1\" (UID: 
\"f2eefae0-724a-451b-b39b-753fa2551ba1\") " Mar 12 12:46:06.670130 master-0 kubenswrapper[13984]: I0312 12:46:06.669986 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6567de78-8717-4214-8b7e-cd66e2ca2a24-operator-scripts\") pod \"6567de78-8717-4214-8b7e-cd66e2ca2a24\" (UID: \"6567de78-8717-4214-8b7e-cd66e2ca2a24\") " Mar 12 12:46:06.670130 master-0 kubenswrapper[13984]: I0312 12:46:06.670032 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnq68\" (UniqueName: \"kubernetes.io/projected/6d81f389-a57f-4baa-9df7-21072a39b775-kube-api-access-cnq68\") pod \"6d81f389-a57f-4baa-9df7-21072a39b775\" (UID: \"6d81f389-a57f-4baa-9df7-21072a39b775\") " Mar 12 12:46:06.670268 master-0 kubenswrapper[13984]: I0312 12:46:06.670206 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d81f389-a57f-4baa-9df7-21072a39b775-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6d81f389-a57f-4baa-9df7-21072a39b775" (UID: "6d81f389-a57f-4baa-9df7-21072a39b775"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:06.670709 master-0 kubenswrapper[13984]: I0312 12:46:06.670648 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6567de78-8717-4214-8b7e-cd66e2ca2a24-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6567de78-8717-4214-8b7e-cd66e2ca2a24" (UID: "6567de78-8717-4214-8b7e-cd66e2ca2a24"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:06.670970 master-0 kubenswrapper[13984]: I0312 12:46:06.670932 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6d81f389-a57f-4baa-9df7-21072a39b775-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:06.670970 master-0 kubenswrapper[13984]: I0312 12:46:06.670966 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6567de78-8717-4214-8b7e-cd66e2ca2a24-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:06.671066 master-0 kubenswrapper[13984]: I0312 12:46:06.670986 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/eb892f56-bc0f-4500-903c-49c41d16945b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:06.671066 master-0 kubenswrapper[13984]: I0312 12:46:06.671006 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2eefae0-724a-451b-b39b-753fa2551ba1-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:06.671066 master-0 kubenswrapper[13984]: I0312 12:46:06.671026 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf8gh\" (UniqueName: \"kubernetes.io/projected/eb892f56-bc0f-4500-903c-49c41d16945b-kube-api-access-nf8gh\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:06.686607 master-0 kubenswrapper[13984]: I0312 12:46:06.686472 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d81f389-a57f-4baa-9df7-21072a39b775-kube-api-access-cnq68" (OuterVolumeSpecName: "kube-api-access-cnq68") pod "6d81f389-a57f-4baa-9df7-21072a39b775" (UID: "6d81f389-a57f-4baa-9df7-21072a39b775"). InnerVolumeSpecName "kube-api-access-cnq68". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:06.686607 master-0 kubenswrapper[13984]: I0312 12:46:06.686618 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6567de78-8717-4214-8b7e-cd66e2ca2a24-kube-api-access-8sbbq" (OuterVolumeSpecName: "kube-api-access-8sbbq") pod "6567de78-8717-4214-8b7e-cd66e2ca2a24" (UID: "6567de78-8717-4214-8b7e-cd66e2ca2a24"). InnerVolumeSpecName "kube-api-access-8sbbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:06.686925 master-0 kubenswrapper[13984]: I0312 12:46:06.686738 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2eefae0-724a-451b-b39b-753fa2551ba1-kube-api-access-bjmdx" (OuterVolumeSpecName: "kube-api-access-bjmdx") pod "f2eefae0-724a-451b-b39b-753fa2551ba1" (UID: "f2eefae0-724a-451b-b39b-753fa2551ba1"). InnerVolumeSpecName "kube-api-access-bjmdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:06.773999 master-0 kubenswrapper[13984]: I0312 12:46:06.773925 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnq68\" (UniqueName: \"kubernetes.io/projected/6d81f389-a57f-4baa-9df7-21072a39b775-kube-api-access-cnq68\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:06.773999 master-0 kubenswrapper[13984]: I0312 12:46:06.773987 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sbbq\" (UniqueName: \"kubernetes.io/projected/6567de78-8717-4214-8b7e-cd66e2ca2a24-kube-api-access-8sbbq\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:06.773999 master-0 kubenswrapper[13984]: I0312 12:46:06.774010 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjmdx\" (UniqueName: \"kubernetes.io/projected/f2eefae0-724a-451b-b39b-753fa2551ba1-kube-api-access-bjmdx\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:07.630508 master-0 kubenswrapper[13984]: I0312 12:46:07.630335 13984 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6rxfx" event={"ID":"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6","Type":"ContainerStarted","Data":"92980a9dc5c656c7949aada98e855c3fd36acc24bf9b78c0bbeaaf8643326e67"} Mar 12 12:46:07.916620 master-0 kubenswrapper[13984]: I0312 12:46:07.916411 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6rxfx" podStartSLOduration=10.079598971 podStartE2EDuration="15.916394136s" podCreationTimestamp="2026-03-12 12:45:52 +0000 UTC" firstStartedPulling="2026-03-12 12:46:01.138668294 +0000 UTC m=+1293.336683786" lastFinishedPulling="2026-03-12 12:46:06.975463459 +0000 UTC m=+1299.173478951" observedRunningTime="2026-03-12 12:46:07.897555455 +0000 UTC m=+1300.095570957" watchObservedRunningTime="2026-03-12 12:46:07.916394136 +0000 UTC m=+1300.114409628" Mar 12 12:46:11.678575 master-0 kubenswrapper[13984]: I0312 12:46:11.678408 13984 generic.go:334] "Generic (PLEG): container finished" podID="e6e063ff-c1ba-4c47-ba35-c3ee0194dae6" containerID="92980a9dc5c656c7949aada98e855c3fd36acc24bf9b78c0bbeaaf8643326e67" exitCode=0 Mar 12 12:46:11.678575 master-0 kubenswrapper[13984]: I0312 12:46:11.678462 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6rxfx" event={"ID":"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6","Type":"ContainerDied","Data":"92980a9dc5c656c7949aada98e855c3fd36acc24bf9b78c0bbeaaf8643326e67"} Mar 12 12:46:12.692273 master-0 kubenswrapper[13984]: I0312 12:46:12.692202 13984 generic.go:334] "Generic (PLEG): container finished" podID="f56d283a-8eb6-4927-bf3c-3145cc96ee28" containerID="3f711da18448676cc41570fef4760423396bf0252c1cbadadd8cd053cbf0da4e" exitCode=0 Mar 12 12:46:12.692841 master-0 kubenswrapper[13984]: I0312 12:46:12.692395 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zt2nr" 
event={"ID":"f56d283a-8eb6-4927-bf3c-3145cc96ee28","Type":"ContainerDied","Data":"3f711da18448676cc41570fef4760423396bf0252c1cbadadd8cd053cbf0da4e"} Mar 12 12:46:13.132848 master-0 kubenswrapper[13984]: I0312 12:46:13.132816 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6rxfx" Mar 12 12:46:13.220308 master-0 kubenswrapper[13984]: I0312 12:46:13.220236 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrg5w\" (UniqueName: \"kubernetes.io/projected/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-kube-api-access-rrg5w\") pod \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " Mar 12 12:46:13.220581 master-0 kubenswrapper[13984]: I0312 12:46:13.220322 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-combined-ca-bundle\") pod \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " Mar 12 12:46:13.220581 master-0 kubenswrapper[13984]: I0312 12:46:13.220460 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-config-data\") pod \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\" (UID: \"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6\") " Mar 12 12:46:13.226373 master-0 kubenswrapper[13984]: I0312 12:46:13.226290 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-kube-api-access-rrg5w" (OuterVolumeSpecName: "kube-api-access-rrg5w") pod "e6e063ff-c1ba-4c47-ba35-c3ee0194dae6" (UID: "e6e063ff-c1ba-4c47-ba35-c3ee0194dae6"). InnerVolumeSpecName "kube-api-access-rrg5w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:13.255402 master-0 kubenswrapper[13984]: I0312 12:46:13.255293 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6e063ff-c1ba-4c47-ba35-c3ee0194dae6" (UID: "e6e063ff-c1ba-4c47-ba35-c3ee0194dae6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:13.280250 master-0 kubenswrapper[13984]: I0312 12:46:13.280193 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-config-data" (OuterVolumeSpecName: "config-data") pod "e6e063ff-c1ba-4c47-ba35-c3ee0194dae6" (UID: "e6e063ff-c1ba-4c47-ba35-c3ee0194dae6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:13.323300 master-0 kubenswrapper[13984]: I0312 12:46:13.323237 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:13.323300 master-0 kubenswrapper[13984]: I0312 12:46:13.323298 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrg5w\" (UniqueName: \"kubernetes.io/projected/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-kube-api-access-rrg5w\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:13.323445 master-0 kubenswrapper[13984]: I0312 12:46:13.323314 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6e063ff-c1ba-4c47-ba35-c3ee0194dae6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:13.710588 master-0 kubenswrapper[13984]: I0312 12:46:13.710545 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-6rxfx" Mar 12 12:46:13.715173 master-0 kubenswrapper[13984]: I0312 12:46:13.715137 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6rxfx" event={"ID":"e6e063ff-c1ba-4c47-ba35-c3ee0194dae6","Type":"ContainerDied","Data":"a899852a0cfc4329c3c96b0fde8062051e56168fda5fb8e13c692a095c6d5c8c"} Mar 12 12:46:13.715868 master-0 kubenswrapper[13984]: I0312 12:46:13.715343 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a899852a0cfc4329c3c96b0fde8062051e56168fda5fb8e13c692a095c6d5c8c" Mar 12 12:46:14.073530 master-0 kubenswrapper[13984]: I0312 12:46:14.072019 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c59dbd877-7mqwr"] Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096449 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2eefae0-724a-451b-b39b-753fa2551ba1" containerName="mariadb-account-create-update" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096551 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2eefae0-724a-451b-b39b-753fa2551ba1" containerName="mariadb-account-create-update" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096586 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerName="init" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096594 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerName="init" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096610 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb892f56-bc0f-4500-903c-49c41d16945b" containerName="mariadb-database-create" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096617 13984 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="eb892f56-bc0f-4500-903c-49c41d16945b" containerName="mariadb-database-create" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096649 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="203e5cda-262b-4776-8012-99314b5df833" containerName="ovn-config" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096656 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="203e5cda-262b-4776-8012-99314b5df833" containerName="ovn-config" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096678 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerName="dnsmasq-dns" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096686 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerName="dnsmasq-dns" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096696 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6e063ff-c1ba-4c47-ba35-c3ee0194dae6" containerName="keystone-db-sync" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096703 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6e063ff-c1ba-4c47-ba35-c3ee0194dae6" containerName="keystone-db-sync" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096715 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d81f389-a57f-4baa-9df7-21072a39b775" containerName="mariadb-account-create-update" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096724 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d81f389-a57f-4baa-9df7-21072a39b775" containerName="mariadb-account-create-update" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: E0312 12:46:14.096757 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6567de78-8717-4214-8b7e-cd66e2ca2a24" 
containerName="mariadb-database-create" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.096765 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="6567de78-8717-4214-8b7e-cd66e2ca2a24" containerName="mariadb-database-create" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.097098 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb892f56-bc0f-4500-903c-49c41d16945b" containerName="mariadb-database-create" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.097125 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6e063ff-c1ba-4c47-ba35-c3ee0194dae6" containerName="keystone-db-sync" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.097153 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="6567de78-8717-4214-8b7e-cd66e2ca2a24" containerName="mariadb-database-create" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.097179 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9a07dca-acf8-4a07-82f7-e5ebe3ff8d4c" containerName="dnsmasq-dns" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.097195 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="203e5cda-262b-4776-8012-99314b5df833" containerName="ovn-config" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.097205 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d81f389-a57f-4baa-9df7-21072a39b775" containerName="mariadb-account-create-update" Mar 12 12:46:14.097675 master-0 kubenswrapper[13984]: I0312 12:46:14.097224 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2eefae0-724a-451b-b39b-753fa2551ba1" containerName="mariadb-account-create-update" Mar 12 12:46:14.098547 master-0 kubenswrapper[13984]: I0312 12:46:14.098496 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c59dbd877-7mqwr"] Mar 12 
12:46:14.098781 master-0 kubenswrapper[13984]: I0312 12:46:14.098601 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.155965 master-0 kubenswrapper[13984]: I0312 12:46:14.152541 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-sb\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.155965 master-0 kubenswrapper[13984]: I0312 12:46:14.152644 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-config\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.155965 master-0 kubenswrapper[13984]: I0312 12:46:14.152702 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-swift-storage-0\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.155965 master-0 kubenswrapper[13984]: I0312 12:46:14.152729 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-svc\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.155965 master-0 kubenswrapper[13984]: I0312 12:46:14.152752 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-nb\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.155965 master-0 kubenswrapper[13984]: I0312 12:46:14.152785 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzx5g\" (UniqueName: \"kubernetes.io/projected/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-kube-api-access-dzx5g\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.176529 master-0 kubenswrapper[13984]: I0312 12:46:14.176266 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4767l"] Mar 12 12:46:14.185969 master-0 kubenswrapper[13984]: I0312 12:46:14.182455 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.193942 master-0 kubenswrapper[13984]: I0312 12:46:14.192510 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 12:46:14.193942 master-0 kubenswrapper[13984]: I0312 12:46:14.192700 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 12:46:14.193942 master-0 kubenswrapper[13984]: I0312 12:46:14.192841 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 12:46:14.193942 master-0 kubenswrapper[13984]: I0312 12:46:14.193006 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.257929 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-combined-ca-bundle\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258032 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-credential-keys\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258074 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-swift-storage-0\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" 
Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258115 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-svc\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258155 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-nb\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258198 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzx5g\" (UniqueName: \"kubernetes.io/projected/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-kube-api-access-dzx5g\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258255 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-config-data\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258314 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-sb\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " 
pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258349 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-scripts\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258384 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx8jz\" (UniqueName: \"kubernetes.io/projected/04c8b647-d290-4861-8709-6f5ce55141d3-kube-api-access-xx8jz\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258414 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-fernet-keys\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.258515 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-config\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.259679 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-config\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " 
pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.260542 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-swift-storage-0\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.268503 master-0 kubenswrapper[13984]: I0312 12:46:14.267522 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-svc\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.284070 master-0 kubenswrapper[13984]: I0312 12:46:14.272725 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4767l"] Mar 12 12:46:14.284070 master-0 kubenswrapper[13984]: I0312 12:46:14.276573 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-sb\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.284070 master-0 kubenswrapper[13984]: I0312 12:46:14.283520 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-nb\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.363510 master-0 kubenswrapper[13984]: I0312 12:46:14.357143 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzx5g\" (UniqueName: 
\"kubernetes.io/projected/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-kube-api-access-dzx5g\") pod \"dnsmasq-dns-c59dbd877-7mqwr\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.363510 master-0 kubenswrapper[13984]: I0312 12:46:14.362459 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-combined-ca-bundle\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.363510 master-0 kubenswrapper[13984]: I0312 12:46:14.362527 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-credential-keys\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.363510 master-0 kubenswrapper[13984]: I0312 12:46:14.362597 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-config-data\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.363510 master-0 kubenswrapper[13984]: I0312 12:46:14.362636 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-scripts\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.363510 master-0 kubenswrapper[13984]: I0312 12:46:14.362654 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx8jz\" (UniqueName: 
\"kubernetes.io/projected/04c8b647-d290-4861-8709-6f5ce55141d3-kube-api-access-xx8jz\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.363510 master-0 kubenswrapper[13984]: I0312 12:46:14.362672 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-fernet-keys\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.367607 master-0 kubenswrapper[13984]: I0312 12:46:14.367571 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-fernet-keys\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.380618 master-0 kubenswrapper[13984]: I0312 12:46:14.371409 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-combined-ca-bundle\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.380618 master-0 kubenswrapper[13984]: I0312 12:46:14.373658 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-credential-keys\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.380618 master-0 kubenswrapper[13984]: I0312 12:46:14.377039 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-config-data\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.398579 master-0 kubenswrapper[13984]: I0312 12:46:14.393511 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-scripts\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.407509 master-0 kubenswrapper[13984]: I0312 12:46:14.402095 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-fswww"] Mar 12 12:46:14.407509 master-0 kubenswrapper[13984]: I0312 12:46:14.403570 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.444578 master-0 kubenswrapper[13984]: I0312 12:46:14.436986 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx8jz\" (UniqueName: \"kubernetes.io/projected/04c8b647-d290-4861-8709-6f5ce55141d3-kube-api-access-xx8jz\") pod \"keystone-bootstrap-4767l\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.444578 master-0 kubenswrapper[13984]: I0312 12:46:14.439550 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-fswww"] Mar 12 12:46:14.468501 master-0 kubenswrapper[13984]: I0312 12:46:14.468109 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:14.476657 master-0 kubenswrapper[13984]: I0312 12:46:14.469342 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjnf8\" (UniqueName: \"kubernetes.io/projected/149d48ba-e4d1-4684-ae20-f17d4e1f247f-kube-api-access-kjnf8\") pod \"ironic-db-create-fswww\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.476657 master-0 kubenswrapper[13984]: I0312 12:46:14.469447 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149d48ba-e4d1-4684-ae20-f17d4e1f247f-operator-scripts\") pod \"ironic-db-create-fswww\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.557884 master-0 kubenswrapper[13984]: I0312 12:46:14.545001 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:14.605816 master-0 kubenswrapper[13984]: I0312 12:46:14.602625 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjnf8\" (UniqueName: \"kubernetes.io/projected/149d48ba-e4d1-4684-ae20-f17d4e1f247f-kube-api-access-kjnf8\") pod \"ironic-db-create-fswww\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.605816 master-0 kubenswrapper[13984]: I0312 12:46:14.602753 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149d48ba-e4d1-4684-ae20-f17d4e1f247f-operator-scripts\") pod \"ironic-db-create-fswww\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.605816 master-0 kubenswrapper[13984]: I0312 12:46:14.603889 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149d48ba-e4d1-4684-ae20-f17d4e1f247f-operator-scripts\") pod \"ironic-db-create-fswww\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.670510 master-0 kubenswrapper[13984]: I0312 12:46:14.667285 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjnf8\" (UniqueName: \"kubernetes.io/projected/149d48ba-e4d1-4684-ae20-f17d4e1f247f-kube-api-access-kjnf8\") pod \"ironic-db-create-fswww\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.714506 master-0 kubenswrapper[13984]: I0312 12:46:14.709294 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-create-fswww" Mar 12 12:46:14.769501 master-0 kubenswrapper[13984]: I0312 12:46:14.766610 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-df3c-account-create-update-rbfjs"] Mar 12 12:46:14.769501 master-0 kubenswrapper[13984]: I0312 12:46:14.767952 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:14.821618 master-0 kubenswrapper[13984]: I0312 12:46:14.806941 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Mar 12 12:46:14.821618 master-0 kubenswrapper[13984]: I0312 12:46:14.816976 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f92e833d-9f24-4783-a5d6-89a7fa102026-operator-scripts\") pod \"ironic-df3c-account-create-update-rbfjs\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") " pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:14.821618 master-0 kubenswrapper[13984]: I0312 12:46:14.817025 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ltqf\" (UniqueName: \"kubernetes.io/projected/f92e833d-9f24-4783-a5d6-89a7fa102026-kube-api-access-4ltqf\") pod \"ironic-df3c-account-create-update-rbfjs\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") " pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:14.850575 master-0 kubenswrapper[13984]: I0312 12:46:14.848633 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-df3c-account-create-update-rbfjs"] Mar 12 12:46:14.871897 master-0 kubenswrapper[13984]: I0312 12:46:14.871860 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-zt2nr" Mar 12 12:46:14.872310 master-0 kubenswrapper[13984]: I0312 12:46:14.872291 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-dfnvv"] Mar 12 12:46:14.873066 master-0 kubenswrapper[13984]: E0312 12:46:14.872827 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f56d283a-8eb6-4927-bf3c-3145cc96ee28" containerName="glance-db-sync" Mar 12 12:46:14.873066 master-0 kubenswrapper[13984]: I0312 12:46:14.872848 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f56d283a-8eb6-4927-bf3c-3145cc96ee28" containerName="glance-db-sync" Mar 12 12:46:14.873177 master-0 kubenswrapper[13984]: I0312 12:46:14.873070 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="f56d283a-8eb6-4927-bf3c-3145cc96ee28" containerName="glance-db-sync" Mar 12 12:46:14.873703 master-0 kubenswrapper[13984]: I0312 12:46:14.873686 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:14.876562 master-0 kubenswrapper[13984]: I0312 12:46:14.875552 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 12:46:14.879239 master-0 kubenswrapper[13984]: I0312 12:46:14.879198 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 12:46:14.923015 master-0 kubenswrapper[13984]: I0312 12:46:14.915754 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dfnvv"] Mar 12 12:46:14.923015 master-0 kubenswrapper[13984]: I0312 12:46:14.919544 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f92e833d-9f24-4783-a5d6-89a7fa102026-operator-scripts\") pod \"ironic-df3c-account-create-update-rbfjs\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") " 
pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:14.923015 master-0 kubenswrapper[13984]: I0312 12:46:14.920090 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ltqf\" (UniqueName: \"kubernetes.io/projected/f92e833d-9f24-4783-a5d6-89a7fa102026-kube-api-access-4ltqf\") pod \"ironic-df3c-account-create-update-rbfjs\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") " pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:14.923015 master-0 kubenswrapper[13984]: I0312 12:46:14.922863 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f92e833d-9f24-4783-a5d6-89a7fa102026-operator-scripts\") pod \"ironic-df3c-account-create-update-rbfjs\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") " pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:14.951168 master-0 kubenswrapper[13984]: I0312 12:46:14.950289 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-5pl65"] Mar 12 12:46:14.952156 master-0 kubenswrapper[13984]: I0312 12:46:14.952093 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:14.955467 master-0 kubenswrapper[13984]: I0312 12:46:14.954524 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 12 12:46:14.960073 master-0 kubenswrapper[13984]: I0312 12:46:14.959879 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 12 12:46:14.971418 master-0 kubenswrapper[13984]: I0312 12:46:14.971311 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ltqf\" (UniqueName: \"kubernetes.io/projected/f92e833d-9f24-4783-a5d6-89a7fa102026-kube-api-access-4ltqf\") pod \"ironic-df3c-account-create-update-rbfjs\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") " pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:14.984740 master-0 kubenswrapper[13984]: I0312 12:46:14.984692 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5pl65"] Mar 12 12:46:15.016940 master-0 kubenswrapper[13984]: I0312 12:46:15.016882 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c59dbd877-7mqwr"] Mar 12 12:46:15.021446 master-0 kubenswrapper[13984]: I0312 12:46:15.021358 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-config-data\") pod \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " Mar 12 12:46:15.021621 master-0 kubenswrapper[13984]: I0312 12:46:15.021563 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-combined-ca-bundle\") pod \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " Mar 12 12:46:15.022531 master-0 kubenswrapper[13984]: I0312 
12:46:15.021692 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhgc6\" (UniqueName: \"kubernetes.io/projected/f56d283a-8eb6-4927-bf3c-3145cc96ee28-kube-api-access-fhgc6\") pod \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " Mar 12 12:46:15.022531 master-0 kubenswrapper[13984]: I0312 12:46:15.021793 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-db-sync-config-data\") pod \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\" (UID: \"f56d283a-8eb6-4927-bf3c-3145cc96ee28\") " Mar 12 12:46:15.022531 master-0 kubenswrapper[13984]: I0312 12:46:15.022187 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-combined-ca-bundle\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.022531 master-0 kubenswrapper[13984]: I0312 12:46:15.022252 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-config\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.022531 master-0 kubenswrapper[13984]: I0312 12:46:15.022362 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2lqs\" (UniqueName: \"kubernetes.io/projected/d2a33401-2a04-4b67-ad4f-702034d7a0a6-kube-api-access-v2lqs\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.028898 master-0 kubenswrapper[13984]: I0312 12:46:15.028762 
13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "f56d283a-8eb6-4927-bf3c-3145cc96ee28" (UID: "f56d283a-8eb6-4927-bf3c-3145cc96ee28"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:15.035024 master-0 kubenswrapper[13984]: I0312 12:46:15.034342 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f56d283a-8eb6-4927-bf3c-3145cc96ee28-kube-api-access-fhgc6" (OuterVolumeSpecName: "kube-api-access-fhgc6") pod "f56d283a-8eb6-4927-bf3c-3145cc96ee28" (UID: "f56d283a-8eb6-4927-bf3c-3145cc96ee28"). InnerVolumeSpecName "kube-api-access-fhgc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:15.060704 master-0 kubenswrapper[13984]: I0312 12:46:15.050219 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-db-sync-rcknr"] Mar 12 12:46:15.060704 master-0 kubenswrapper[13984]: I0312 12:46:15.052275 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.064447 master-0 kubenswrapper[13984]: I0312 12:46:15.062520 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-scripts" Mar 12 12:46:15.064447 master-0 kubenswrapper[13984]: I0312 12:46:15.062674 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-config-data" Mar 12 12:46:15.064447 master-0 kubenswrapper[13984]: I0312 12:46:15.064283 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-db-sync-rcknr"] Mar 12 12:46:15.106812 master-0 kubenswrapper[13984]: I0312 12:46:15.105915 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f56d283a-8eb6-4927-bf3c-3145cc96ee28" (UID: "f56d283a-8eb6-4927-bf3c-3145cc96ee28"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.115898 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-df3c-account-create-update-rbfjs" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.124922 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2lqs\" (UniqueName: \"kubernetes.io/projected/d2a33401-2a04-4b67-ad4f-702034d7a0a6-kube-api-access-v2lqs\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.125034 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-scripts\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.125080 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8085b96-6908-4956-9e4c-290797974f87-logs\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.125139 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-combined-ca-bundle\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.125583 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-config-data\") pod 
\"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.125651 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-combined-ca-bundle\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.125912 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67b7d74975-bfgsb"] Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.126065 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-config\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.126109 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-947cb\" (UniqueName: \"kubernetes.io/projected/c8085b96-6908-4956-9e4c-290797974f87-kube-api-access-947cb\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.126678 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.126702 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhgc6\" (UniqueName: 
\"kubernetes.io/projected/f56d283a-8eb6-4927-bf3c-3145cc96ee28-kube-api-access-fhgc6\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.126715 13984 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.127935 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.131949 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-combined-ca-bundle\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.115975 master-0 kubenswrapper[13984]: I0312 12:46:15.132166 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-config\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.144366 master-0 kubenswrapper[13984]: I0312 12:46:15.144246 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2lqs\" (UniqueName: \"kubernetes.io/projected/d2a33401-2a04-4b67-ad4f-702034d7a0a6-kube-api-access-v2lqs\") pod \"neutron-db-sync-dfnvv\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.168360 master-0 kubenswrapper[13984]: I0312 12:46:15.168272 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-config-data" 
(OuterVolumeSpecName: "config-data") pod "f56d283a-8eb6-4927-bf3c-3145cc96ee28" (UID: "f56d283a-8eb6-4927-bf3c-3145cc96ee28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.227155 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228033 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-db-sync-config-data\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228107 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-nb\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228153 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-scripts\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228196 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-sb\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: 
\"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228224 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8085b96-6908-4956-9e4c-290797974f87-logs\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228275 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-combined-ca-bundle\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228309 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-config-data\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228331 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-config\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228354 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c180ef3d-38aa-4843-b682-92e609e6d856-etc-machine-id\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: 
\"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228377 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-swift-storage-0\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228408 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-svc\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228455 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-947cb\" (UniqueName: \"kubernetes.io/projected/c8085b96-6908-4956-9e4c-290797974f87-kube-api-access-947cb\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228499 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-scripts\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228537 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-combined-ca-bundle\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228569 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rkw\" (UniqueName: \"kubernetes.io/projected/c180ef3d-38aa-4843-b682-92e609e6d856-kube-api-access-n4rkw\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228597 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9j95\" (UniqueName: \"kubernetes.io/projected/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-kube-api-access-b9j95\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228627 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-config-data\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.230517 master-0 kubenswrapper[13984]: I0312 12:46:15.228722 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f56d283a-8eb6-4927-bf3c-3145cc96ee28-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:15.232039 master-0 kubenswrapper[13984]: I0312 12:46:15.231975 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-scripts\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.232340 master-0 kubenswrapper[13984]: I0312 12:46:15.232307 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8085b96-6908-4956-9e4c-290797974f87-logs\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.240193 master-0 kubenswrapper[13984]: I0312 12:46:15.239834 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-combined-ca-bundle\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.247973 master-0 kubenswrapper[13984]: I0312 12:46:15.245161 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-config-data\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.283827 master-0 kubenswrapper[13984]: I0312 12:46:15.283590 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67b7d74975-bfgsb"] Mar 12 12:46:15.293227 master-0 kubenswrapper[13984]: I0312 12:46:15.290298 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-947cb\" (UniqueName: \"kubernetes.io/projected/c8085b96-6908-4956-9e4c-290797974f87-kube-api-access-947cb\") pod \"placement-db-sync-5pl65\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.330926 master-0 kubenswrapper[13984]: I0312 12:46:15.330783 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-config\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.330926 master-0 kubenswrapper[13984]: I0312 12:46:15.330865 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c180ef3d-38aa-4843-b682-92e609e6d856-etc-machine-id\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.330926 master-0 kubenswrapper[13984]: I0312 12:46:15.330897 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-swift-storage-0\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.331221 master-0 kubenswrapper[13984]: I0312 12:46:15.330934 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-svc\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.331221 master-0 kubenswrapper[13984]: I0312 12:46:15.330994 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-scripts\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.331221 master-0 kubenswrapper[13984]: I0312 12:46:15.331034 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-combined-ca-bundle\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.331221 master-0 kubenswrapper[13984]: I0312 12:46:15.331064 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rkw\" (UniqueName: \"kubernetes.io/projected/c180ef3d-38aa-4843-b682-92e609e6d856-kube-api-access-n4rkw\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.331221 master-0 kubenswrapper[13984]: I0312 12:46:15.331090 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9j95\" (UniqueName: \"kubernetes.io/projected/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-kube-api-access-b9j95\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.331221 master-0 kubenswrapper[13984]: I0312 12:46:15.331119 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-config-data\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.331221 master-0 kubenswrapper[13984]: I0312 12:46:15.331207 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-db-sync-config-data\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.331541 master-0 kubenswrapper[13984]: I0312 12:46:15.331252 
13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-nb\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.331541 master-0 kubenswrapper[13984]: I0312 12:46:15.331324 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-sb\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.332536 master-0 kubenswrapper[13984]: I0312 12:46:15.332467 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-sb\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.333156 master-0 kubenswrapper[13984]: I0312 12:46:15.333089 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-swift-storage-0\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.333923 master-0 kubenswrapper[13984]: I0312 12:46:15.333895 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-config\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.334008 master-0 kubenswrapper[13984]: I0312 12:46:15.333960 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c180ef3d-38aa-4843-b682-92e609e6d856-etc-machine-id\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.337601 master-0 kubenswrapper[13984]: I0312 12:46:15.337555 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-combined-ca-bundle\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.343776 master-0 kubenswrapper[13984]: I0312 12:46:15.341802 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-svc\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.354711 master-0 kubenswrapper[13984]: I0312 12:46:15.345286 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-nb\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.354711 master-0 kubenswrapper[13984]: I0312 12:46:15.351464 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-db-sync-config-data\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.360575 master-0 kubenswrapper[13984]: I0312 12:46:15.360531 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b9j95\" (UniqueName: \"kubernetes.io/projected/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-kube-api-access-b9j95\") pod \"dnsmasq-dns-67b7d74975-bfgsb\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.376153 master-0 kubenswrapper[13984]: I0312 12:46:15.371854 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-config-data\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.376153 master-0 kubenswrapper[13984]: I0312 12:46:15.376073 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-scripts\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.376411 master-0 kubenswrapper[13984]: I0312 12:46:15.376166 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rkw\" (UniqueName: \"kubernetes.io/projected/c180ef3d-38aa-4843-b682-92e609e6d856-kube-api-access-n4rkw\") pod \"cinder-8c9c7-db-sync-rcknr\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.407839 master-0 kubenswrapper[13984]: I0312 12:46:15.406167 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:15.449961 master-0 kubenswrapper[13984]: I0312 12:46:15.447281 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c59dbd877-7mqwr"] Mar 12 12:46:15.449961 master-0 kubenswrapper[13984]: I0312 12:46:15.448509 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:15.497758 master-0 kubenswrapper[13984]: W0312 12:46:15.497669 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc87ca2bf_27f3_4cbd_94a7_9b7f6aca964f.slice/crio-3b8dbb8c9c3663ba3ff020816f8f827fa7040b3db345934bfc1eeca81c0cb2e5 WatchSource:0}: Error finding container 3b8dbb8c9c3663ba3ff020816f8f827fa7040b3db345934bfc1eeca81c0cb2e5: Status 404 returned error can't find the container with id 3b8dbb8c9c3663ba3ff020816f8f827fa7040b3db345934bfc1eeca81c0cb2e5 Mar 12 12:46:15.577257 master-0 kubenswrapper[13984]: I0312 12:46:15.577202 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:15.677108 master-0 kubenswrapper[13984]: I0312 12:46:15.677043 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4767l"] Mar 12 12:46:15.760714 master-0 kubenswrapper[13984]: I0312 12:46:15.756809 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-fswww"] Mar 12 12:46:15.860038 master-0 kubenswrapper[13984]: I0312 12:46:15.859975 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" event={"ID":"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f","Type":"ContainerStarted","Data":"3b8dbb8c9c3663ba3ff020816f8f827fa7040b3db345934bfc1eeca81c0cb2e5"} Mar 12 12:46:15.868325 master-0 kubenswrapper[13984]: I0312 12:46:15.868259 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-fswww" event={"ID":"149d48ba-e4d1-4684-ae20-f17d4e1f247f","Type":"ContainerStarted","Data":"39b65fdf2432f31e32c6c9178da3a8ae14ca6ded5b4866f7a1677e2f688b1412"} Mar 12 12:46:15.873341 master-0 kubenswrapper[13984]: I0312 12:46:15.873296 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-zt2nr" 
event={"ID":"f56d283a-8eb6-4927-bf3c-3145cc96ee28","Type":"ContainerDied","Data":"b2164e539568c2b4717d8a389f4e8294960f0def7920d6ddf5372d1bd95ae683"} Mar 12 12:46:15.873341 master-0 kubenswrapper[13984]: I0312 12:46:15.873342 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2164e539568c2b4717d8a389f4e8294960f0def7920d6ddf5372d1bd95ae683" Mar 12 12:46:15.873458 master-0 kubenswrapper[13984]: I0312 12:46:15.873416 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-zt2nr" Mar 12 12:46:15.875204 master-0 kubenswrapper[13984]: I0312 12:46:15.875154 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4767l" event={"ID":"04c8b647-d290-4861-8709-6f5ce55141d3","Type":"ContainerStarted","Data":"8473a43574fd6cbaac7dabaf2bc67a08e9b1df3bc7b9b32f27b5524aed878802"} Mar 12 12:46:15.999268 master-0 kubenswrapper[13984]: W0312 12:46:15.998564 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2a33401_2a04_4b67_ad4f_702034d7a0a6.slice/crio-ed88a900d1f45b8fa141c42ffed42a89b0963beb81ed1c1fd3fcc1f326c15df0 WatchSource:0}: Error finding container ed88a900d1f45b8fa141c42ffed42a89b0963beb81ed1c1fd3fcc1f326c15df0: Status 404 returned error can't find the container with id ed88a900d1f45b8fa141c42ffed42a89b0963beb81ed1c1fd3fcc1f326c15df0 Mar 12 12:46:15.999614 master-0 kubenswrapper[13984]: I0312 12:46:15.999529 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-dfnvv"] Mar 12 12:46:16.425198 master-0 kubenswrapper[13984]: I0312 12:46:16.424586 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b7d74975-bfgsb"] Mar 12 12:46:16.541045 master-0 kubenswrapper[13984]: W0312 12:46:16.540996 13984 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc180ef3d_38aa_4843_b682_92e609e6d856.slice/crio-960a460dc98ea51aaa42d8a75b7846865b7660301833ea5b615d33f2040563bb WatchSource:0}: Error finding container 960a460dc98ea51aaa42d8a75b7846865b7660301833ea5b615d33f2040563bb: Status 404 returned error can't find the container with id 960a460dc98ea51aaa42d8a75b7846865b7660301833ea5b615d33f2040563bb Mar 12 12:46:16.549806 master-0 kubenswrapper[13984]: I0312 12:46:16.545638 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-678f7c7469-hf7k6"] Mar 12 12:46:16.549806 master-0 kubenswrapper[13984]: I0312 12:46:16.548031 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.595792 master-0 kubenswrapper[13984]: I0312 12:46:16.594005 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-df3c-account-create-update-rbfjs"] Mar 12 12:46:16.668975 master-0 kubenswrapper[13984]: I0312 12:46:16.667534 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b7d74975-bfgsb"] Mar 12 12:46:16.697408 master-0 kubenswrapper[13984]: I0312 12:46:16.692608 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-sb\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.697408 master-0 kubenswrapper[13984]: I0312 12:46:16.692706 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-nb\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" 
Mar 12 12:46:16.697408 master-0 kubenswrapper[13984]: I0312 12:46:16.692810 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-svc\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.697408 master-0 kubenswrapper[13984]: I0312 12:46:16.692859 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hq4\" (UniqueName: \"kubernetes.io/projected/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-kube-api-access-49hq4\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.697408 master-0 kubenswrapper[13984]: I0312 12:46:16.692890 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-swift-storage-0\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.697408 master-0 kubenswrapper[13984]: I0312 12:46:16.693022 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-config\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.722884 master-0 kubenswrapper[13984]: I0312 12:46:16.718233 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-678f7c7469-hf7k6"] Mar 12 12:46:16.737837 master-0 kubenswrapper[13984]: I0312 12:46:16.737623 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-8c9c7-db-sync-rcknr"] Mar 12 12:46:16.769847 master-0 kubenswrapper[13984]: I0312 12:46:16.769783 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-5pl65"] Mar 12 12:46:16.806500 master-0 kubenswrapper[13984]: I0312 12:46:16.806432 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-svc\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.806711 master-0 kubenswrapper[13984]: I0312 12:46:16.806580 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49hq4\" (UniqueName: \"kubernetes.io/projected/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-kube-api-access-49hq4\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.806711 master-0 kubenswrapper[13984]: I0312 12:46:16.806639 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-swift-storage-0\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.806808 master-0 kubenswrapper[13984]: I0312 12:46:16.806718 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-config\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.806808 master-0 kubenswrapper[13984]: I0312 12:46:16.806781 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-sb\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.807286 master-0 kubenswrapper[13984]: I0312 12:46:16.806862 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-nb\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.808217 master-0 kubenswrapper[13984]: I0312 12:46:16.808178 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-nb\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.809151 master-0 kubenswrapper[13984]: I0312 12:46:16.809114 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-svc\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.810329 master-0 kubenswrapper[13984]: I0312 12:46:16.810299 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-swift-storage-0\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.811429 master-0 kubenswrapper[13984]: I0312 12:46:16.811391 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-config\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.811524 master-0 kubenswrapper[13984]: I0312 12:46:16.811430 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-sb\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.826533 master-0 kubenswrapper[13984]: I0312 12:46:16.826461 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hq4\" (UniqueName: \"kubernetes.io/projected/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-kube-api-access-49hq4\") pod \"dnsmasq-dns-678f7c7469-hf7k6\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:16.922499 master-0 kubenswrapper[13984]: I0312 12:46:16.921225 13984 generic.go:334] "Generic (PLEG): container finished" podID="c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" containerID="afdeba0b055ac1a728b6da3766372f9fa89d883469cb3b5b789e53d7a3d5d523" exitCode=0 Mar 12 12:46:16.922499 master-0 kubenswrapper[13984]: I0312 12:46:16.921296 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" event={"ID":"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f","Type":"ContainerDied","Data":"afdeba0b055ac1a728b6da3766372f9fa89d883469cb3b5b789e53d7a3d5d523"} Mar 12 12:46:16.934071 master-0 kubenswrapper[13984]: I0312 12:46:16.933989 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-db-sync-rcknr" event={"ID":"c180ef3d-38aa-4843-b682-92e609e6d856","Type":"ContainerStarted","Data":"960a460dc98ea51aaa42d8a75b7846865b7660301833ea5b615d33f2040563bb"} Mar 12 12:46:16.954530 master-0 kubenswrapper[13984]: 
I0312 12:46:16.949855 13984 generic.go:334] "Generic (PLEG): container finished" podID="149d48ba-e4d1-4684-ae20-f17d4e1f247f" containerID="cc6aebc45fb5cb80c969172c1dbef589ec282cc5156656a5178c9e733ca71ecf" exitCode=0 Mar 12 12:46:16.954530 master-0 kubenswrapper[13984]: I0312 12:46:16.949933 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-fswww" event={"ID":"149d48ba-e4d1-4684-ae20-f17d4e1f247f","Type":"ContainerDied","Data":"cc6aebc45fb5cb80c969172c1dbef589ec282cc5156656a5178c9e733ca71ecf"} Mar 12 12:46:16.954530 master-0 kubenswrapper[13984]: E0312 12:46:16.951236 13984 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod149d48ba_e4d1_4684_ae20_f17d4e1f247f.slice/crio-conmon-cc6aebc45fb5cb80c969172c1dbef589ec282cc5156656a5178c9e733ca71ecf.scope\": RecentStats: unable to find data in memory cache]" Mar 12 12:46:16.964913 master-0 kubenswrapper[13984]: I0312 12:46:16.964443 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-df3c-account-create-update-rbfjs" event={"ID":"f92e833d-9f24-4783-a5d6-89a7fa102026","Type":"ContainerStarted","Data":"3e7ac0ab55488d4c3f57a58473ef3c3082cd98b60335fc4b9c79ee88a1bd2c1a"} Mar 12 12:46:16.964913 master-0 kubenswrapper[13984]: I0312 12:46:16.964828 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-df3c-account-create-update-rbfjs" event={"ID":"f92e833d-9f24-4783-a5d6-89a7fa102026","Type":"ContainerStarted","Data":"7f4b50056b5df0259a1f3a4ddbe67b57b7f0dfd8cf8239d65d3e945a68c53df3"} Mar 12 12:46:16.975561 master-0 kubenswrapper[13984]: I0312 12:46:16.975507 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" event={"ID":"7907ad88-3c30-4b34-ac0d-6ac0106a75d1","Type":"ContainerStarted","Data":"9150832b9ba176e5f6f92c225ff8c8093a22b776b5246371e48f30e5de0ca3f0"} Mar 12 
12:46:16.983304 master-0 kubenswrapper[13984]: I0312 12:46:16.983249 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dfnvv" event={"ID":"d2a33401-2a04-4b67-ad4f-702034d7a0a6","Type":"ContainerStarted","Data":"eae2a9932d934690d2763d8202034296c040081ae3224dcceab9627695765370"} Mar 12 12:46:16.983584 master-0 kubenswrapper[13984]: I0312 12:46:16.983562 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dfnvv" event={"ID":"d2a33401-2a04-4b67-ad4f-702034d7a0a6","Type":"ContainerStarted","Data":"ed88a900d1f45b8fa141c42ffed42a89b0963beb81ed1c1fd3fcc1f326c15df0"} Mar 12 12:46:16.994921 master-0 kubenswrapper[13984]: I0312 12:46:16.991961 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5pl65" event={"ID":"c8085b96-6908-4956-9e4c-290797974f87","Type":"ContainerStarted","Data":"c6dd281ef0960bbb187a9bb02d0fb3f242851b4396175a946c3682cdb441e9cf"} Mar 12 12:46:16.995180 master-0 kubenswrapper[13984]: I0312 12:46:16.995137 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:17.018035 master-0 kubenswrapper[13984]: I0312 12:46:17.008224 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4767l" event={"ID":"04c8b647-d290-4861-8709-6f5ce55141d3","Type":"ContainerStarted","Data":"2cd511b3f5082cee0c8dab442e8c47fbf12993f38966dcf97f415e18b360a966"} Mar 12 12:46:17.021224 master-0 kubenswrapper[13984]: I0312 12:46:17.020496 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-df3c-account-create-update-rbfjs" podStartSLOduration=3.020459345 podStartE2EDuration="3.020459345s" podCreationTimestamp="2026-03-12 12:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:17.005531935 +0000 UTC m=+1309.203547427" watchObservedRunningTime="2026-03-12 12:46:17.020459345 +0000 UTC m=+1309.218474837" Mar 12 12:46:17.062249 master-0 kubenswrapper[13984]: I0312 12:46:17.044423 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-dfnvv" podStartSLOduration=3.044403536 podStartE2EDuration="3.044403536s" podCreationTimestamp="2026-03-12 12:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:17.029676441 +0000 UTC m=+1309.227691943" watchObservedRunningTime="2026-03-12 12:46:17.044403536 +0000 UTC m=+1309.242419028" Mar 12 12:46:17.080427 master-0 kubenswrapper[13984]: I0312 12:46:17.080238 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4767l" podStartSLOduration=3.080216974 podStartE2EDuration="3.080216974s" podCreationTimestamp="2026-03-12 12:46:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-12 12:46:17.078256338 +0000 UTC m=+1309.276271830" watchObservedRunningTime="2026-03-12 12:46:17.080216974 +0000 UTC m=+1309.278232466" Mar 12 12:46:17.665389 master-0 kubenswrapper[13984]: I0312 12:46:17.665344 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:17.749532 master-0 kubenswrapper[13984]: I0312 12:46:17.749426 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-nb\") pod \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " Mar 12 12:46:17.749777 master-0 kubenswrapper[13984]: I0312 12:46:17.749604 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-swift-storage-0\") pod \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " Mar 12 12:46:17.749777 master-0 kubenswrapper[13984]: I0312 12:46:17.749656 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-svc\") pod \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " Mar 12 12:46:17.750325 master-0 kubenswrapper[13984]: I0312 12:46:17.750300 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzx5g\" (UniqueName: \"kubernetes.io/projected/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-kube-api-access-dzx5g\") pod \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " Mar 12 12:46:17.750374 master-0 kubenswrapper[13984]: I0312 12:46:17.750322 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-678f7c7469-hf7k6"] Mar 12 12:46:17.750374 master-0 kubenswrapper[13984]: I0312 12:46:17.750357 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-sb\") pod \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " Mar 12 12:46:17.750604 master-0 kubenswrapper[13984]: I0312 12:46:17.750585 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-config\") pod \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\" (UID: \"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f\") " Mar 12 12:46:17.763649 master-0 kubenswrapper[13984]: I0312 12:46:17.763600 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-kube-api-access-dzx5g" (OuterVolumeSpecName: "kube-api-access-dzx5g") pod "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" (UID: "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f"). InnerVolumeSpecName "kube-api-access-dzx5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:17.779346 master-0 kubenswrapper[13984]: I0312 12:46:17.779285 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-config" (OuterVolumeSpecName: "config") pod "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" (UID: "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:17.785994 master-0 kubenswrapper[13984]: I0312 12:46:17.785925 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" (UID: "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:17.804173 master-0 kubenswrapper[13984]: I0312 12:46:17.804097 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" (UID: "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:17.810607 master-0 kubenswrapper[13984]: I0312 12:46:17.809762 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" (UID: "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:17.829579 master-0 kubenswrapper[13984]: I0312 12:46:17.828131 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" (UID: "c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:17.853115 master-0 kubenswrapper[13984]: I0312 12:46:17.852554 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:17.853115 master-0 kubenswrapper[13984]: I0312 12:46:17.852601 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:17.853115 master-0 kubenswrapper[13984]: I0312 12:46:17.852613 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:17.853115 master-0 kubenswrapper[13984]: I0312 12:46:17.852624 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:17.853115 master-0 kubenswrapper[13984]: I0312 12:46:17.852633 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dzx5g\" (UniqueName: \"kubernetes.io/projected/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-kube-api-access-dzx5g\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:17.853115 master-0 kubenswrapper[13984]: I0312 12:46:17.852642 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:18.040672 master-0 kubenswrapper[13984]: I0312 12:46:18.039021 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" 
event={"ID":"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118","Type":"ContainerStarted","Data":"9cf24830755bb46d1234bf2d28fb38969322c5ce5bebc5d9c66bcf6f05b46a40"} Mar 12 12:46:18.040672 master-0 kubenswrapper[13984]: I0312 12:46:18.039082 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" event={"ID":"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118","Type":"ContainerStarted","Data":"24dc8393d60a4f4241937f837b4789dd0eb63473193be9f70275159ebbce7e86"} Mar 12 12:46:18.041061 master-0 kubenswrapper[13984]: I0312 12:46:18.041028 13984 generic.go:334] "Generic (PLEG): container finished" podID="f92e833d-9f24-4783-a5d6-89a7fa102026" containerID="3e7ac0ab55488d4c3f57a58473ef3c3082cd98b60335fc4b9c79ee88a1bd2c1a" exitCode=0 Mar 12 12:46:18.041566 master-0 kubenswrapper[13984]: I0312 12:46:18.041095 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-df3c-account-create-update-rbfjs" event={"ID":"f92e833d-9f24-4783-a5d6-89a7fa102026","Type":"ContainerDied","Data":"3e7ac0ab55488d4c3f57a58473ef3c3082cd98b60335fc4b9c79ee88a1bd2c1a"} Mar 12 12:46:18.049107 master-0 kubenswrapper[13984]: I0312 12:46:18.049056 13984 generic.go:334] "Generic (PLEG): container finished" podID="7907ad88-3c30-4b34-ac0d-6ac0106a75d1" containerID="1ef1701ec700b5eba71d3e307127f594c3e48b58355947e8c9e6a48e153d3dd1" exitCode=0 Mar 12 12:46:18.049203 master-0 kubenswrapper[13984]: I0312 12:46:18.049156 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" event={"ID":"7907ad88-3c30-4b34-ac0d-6ac0106a75d1","Type":"ContainerDied","Data":"1ef1701ec700b5eba71d3e307127f594c3e48b58355947e8c9e6a48e153d3dd1"} Mar 12 12:46:18.068981 master-0 kubenswrapper[13984]: I0312 12:46:18.068101 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" Mar 12 12:46:18.069230 master-0 kubenswrapper[13984]: I0312 12:46:18.069203 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c59dbd877-7mqwr" event={"ID":"c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f","Type":"ContainerDied","Data":"3b8dbb8c9c3663ba3ff020816f8f827fa7040b3db345934bfc1eeca81c0cb2e5"} Mar 12 12:46:18.069306 master-0 kubenswrapper[13984]: I0312 12:46:18.069238 13984 scope.go:117] "RemoveContainer" containerID="afdeba0b055ac1a728b6da3766372f9fa89d883469cb3b5b789e53d7a3d5d523" Mar 12 12:46:18.213041 master-0 kubenswrapper[13984]: I0312 12:46:18.211543 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c59dbd877-7mqwr"] Mar 12 12:46:18.218907 master-0 kubenswrapper[13984]: I0312 12:46:18.218860 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c59dbd877-7mqwr"] Mar 12 12:46:18.634459 master-0 kubenswrapper[13984]: I0312 12:46:18.633940 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:46:18.634459 master-0 kubenswrapper[13984]: E0312 12:46:18.634453 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" containerName="init" Mar 12 12:46:18.634459 master-0 kubenswrapper[13984]: I0312 12:46:18.634465 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" containerName="init" Mar 12 12:46:18.634829 master-0 kubenswrapper[13984]: I0312 12:46:18.634788 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" containerName="init" Mar 12 12:46:18.635862 master-0 kubenswrapper[13984]: I0312 12:46:18.635835 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.653043 master-0 kubenswrapper[13984]: I0312 12:46:18.640491 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 12 12:46:18.653043 master-0 kubenswrapper[13984]: I0312 12:46:18.640746 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-f98a5-default-external-config-data" Mar 12 12:46:18.688836 master-0 kubenswrapper[13984]: I0312 12:46:18.679877 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:46:18.706043 master-0 kubenswrapper[13984]: I0312 12:46:18.705962 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwkps\" (UniqueName: \"kubernetes.io/projected/1aa2287a-21d8-417d-85a7-49baade8ffd8-kube-api-access-qwkps\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.706374 master-0 kubenswrapper[13984]: I0312 12:46:18.706347 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.707745 master-0 kubenswrapper[13984]: I0312 12:46:18.707715 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.707939 master-0 kubenswrapper[13984]: I0312 
12:46:18.707919 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.714911 master-0 kubenswrapper[13984]: I0312 12:46:18.708697 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.716397 master-0 kubenswrapper[13984]: I0312 12:46:18.715344 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.716735 master-0 kubenswrapper[13984]: I0312 12:46:18.716706 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.824689 master-0 kubenswrapper[13984]: I0312 12:46:18.824628 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: 
\"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.824708 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.824786 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwkps\" (UniqueName: \"kubernetes.io/projected/1aa2287a-21d8-417d-85a7-49baade8ffd8-kube-api-access-qwkps\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.824809 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.824867 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.824899 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.824941 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.826172 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.833807 master-0 kubenswrapper[13984]: I0312 12:46:18.827291 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.834628 master-0 kubenswrapper[13984]: I0312 12:46:18.834586 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.836157 master-0 kubenswrapper[13984]: I0312 12:46:18.836131 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice 
STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 12 12:46:18.836218 master-0 kubenswrapper[13984]: I0312 12:46:18.836160 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/705cac2184d649d44bc54d9fe6c523322613cb4b44faad7af7e5abd2b4c3196c/globalmount\"" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.846595 master-0 kubenswrapper[13984]: I0312 12:46:18.844368 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.846595 master-0 kubenswrapper[13984]: I0312 12:46:18.846009 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwkps\" (UniqueName: \"kubernetes.io/projected/1aa2287a-21d8-417d-85a7-49baade8ffd8-kube-api-access-qwkps\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.849908 master-0 kubenswrapper[13984]: I0312 12:46:18.849870 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:18.994032 master-0 kubenswrapper[13984]: I0312 12:46:18.993984 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" Mar 12 12:46:19.000189 master-0 kubenswrapper[13984]: I0312 12:46:18.998154 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-fswww" Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.138623 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-svc\") pod \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.138660 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjnf8\" (UniqueName: \"kubernetes.io/projected/149d48ba-e4d1-4684-ae20-f17d4e1f247f-kube-api-access-kjnf8\") pod \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.138858 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-sb\") pod \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.138921 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-config\") pod \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.139024 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-nb\") 
pod \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.139047 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9j95\" (UniqueName: \"kubernetes.io/projected/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-kube-api-access-b9j95\") pod \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.139074 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-swift-storage-0\") pod \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\" (UID: \"7907ad88-3c30-4b34-ac0d-6ac0106a75d1\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.139098 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149d48ba-e4d1-4684-ae20-f17d4e1f247f-operator-scripts\") pod \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\" (UID: \"149d48ba-e4d1-4684-ae20-f17d4e1f247f\") " Mar 12 12:46:19.140577 master-0 kubenswrapper[13984]: I0312 12:46:19.140434 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/149d48ba-e4d1-4684-ae20-f17d4e1f247f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "149d48ba-e4d1-4684-ae20-f17d4e1f247f" (UID: "149d48ba-e4d1-4684-ae20-f17d4e1f247f"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:19.146506 master-0 kubenswrapper[13984]: I0312 12:46:19.143283 13984 generic.go:334] "Generic (PLEG): container finished" podID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerID="9cf24830755bb46d1234bf2d28fb38969322c5ce5bebc5d9c66bcf6f05b46a40" exitCode=0 Mar 12 12:46:19.146506 master-0 kubenswrapper[13984]: I0312 12:46:19.143340 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" event={"ID":"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118","Type":"ContainerDied","Data":"9cf24830755bb46d1234bf2d28fb38969322c5ce5bebc5d9c66bcf6f05b46a40"} Mar 12 12:46:19.152505 master-0 kubenswrapper[13984]: I0312 12:46:19.147557 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149d48ba-e4d1-4684-ae20-f17d4e1f247f-kube-api-access-kjnf8" (OuterVolumeSpecName: "kube-api-access-kjnf8") pod "149d48ba-e4d1-4684-ae20-f17d4e1f247f" (UID: "149d48ba-e4d1-4684-ae20-f17d4e1f247f"). InnerVolumeSpecName "kube-api-access-kjnf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:19.152505 master-0 kubenswrapper[13984]: I0312 12:46:19.147647 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-kube-api-access-b9j95" (OuterVolumeSpecName: "kube-api-access-b9j95") pod "7907ad88-3c30-4b34-ac0d-6ac0106a75d1" (UID: "7907ad88-3c30-4b34-ac0d-6ac0106a75d1"). InnerVolumeSpecName "kube-api-access-b9j95". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:46:19.152505 master-0 kubenswrapper[13984]: I0312 12:46:19.149184 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-fswww" event={"ID":"149d48ba-e4d1-4684-ae20-f17d4e1f247f","Type":"ContainerDied","Data":"39b65fdf2432f31e32c6c9178da3a8ae14ca6ded5b4866f7a1677e2f688b1412"}
Mar 12 12:46:19.152505 master-0 kubenswrapper[13984]: I0312 12:46:19.149245 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39b65fdf2432f31e32c6c9178da3a8ae14ca6ded5b4866f7a1677e2f688b1412"
Mar 12 12:46:19.152505 master-0 kubenswrapper[13984]: I0312 12:46:19.149380 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-fswww"
Mar 12 12:46:19.162510 master-0 kubenswrapper[13984]: I0312 12:46:19.156947 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67b7d74975-bfgsb" event={"ID":"7907ad88-3c30-4b34-ac0d-6ac0106a75d1","Type":"ContainerDied","Data":"9150832b9ba176e5f6f92c225ff8c8093a22b776b5246371e48f30e5de0ca3f0"}
Mar 12 12:46:19.162510 master-0 kubenswrapper[13984]: I0312 12:46:19.156959 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67b7d74975-bfgsb"
Mar 12 12:46:19.162510 master-0 kubenswrapper[13984]: I0312 12:46:19.157016 13984 scope.go:117] "RemoveContainer" containerID="1ef1701ec700b5eba71d3e307127f594c3e48b58355947e8c9e6a48e153d3dd1"
Mar 12 12:46:19.188211 master-0 kubenswrapper[13984]: I0312 12:46:19.185171 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7907ad88-3c30-4b34-ac0d-6ac0106a75d1" (UID: "7907ad88-3c30-4b34-ac0d-6ac0106a75d1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:19.226932 master-0 kubenswrapper[13984]: I0312 12:46:19.226863 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7907ad88-3c30-4b34-ac0d-6ac0106a75d1" (UID: "7907ad88-3c30-4b34-ac0d-6ac0106a75d1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:19.227616 master-0 kubenswrapper[13984]: I0312 12:46:19.227078 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7907ad88-3c30-4b34-ac0d-6ac0106a75d1" (UID: "7907ad88-3c30-4b34-ac0d-6ac0106a75d1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:19.231783 master-0 kubenswrapper[13984]: I0312 12:46:19.231738 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7907ad88-3c30-4b34-ac0d-6ac0106a75d1" (UID: "7907ad88-3c30-4b34-ac0d-6ac0106a75d1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:19.233016 master-0 kubenswrapper[13984]: I0312 12:46:19.232963 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-config" (OuterVolumeSpecName: "config") pod "7907ad88-3c30-4b34-ac0d-6ac0106a75d1" (UID: "7907ad88-3c30-4b34-ac0d-6ac0106a75d1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244593 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244639 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244653 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9j95\" (UniqueName: \"kubernetes.io/projected/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-kube-api-access-b9j95\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244672 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244682 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/149d48ba-e4d1-4684-ae20-f17d4e1f247f-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244691 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244700 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjnf8\" (UniqueName: \"kubernetes.io/projected/149d48ba-e4d1-4684-ae20-f17d4e1f247f-kube-api-access-kjnf8\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.244923 master-0 kubenswrapper[13984]: I0312 12:46:19.244710 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7907ad88-3c30-4b34-ac0d-6ac0106a75d1-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:19.778549 master-0 kubenswrapper[13984]: I0312 12:46:19.773399 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"]
Mar 12 12:46:19.784950 master-0 kubenswrapper[13984]: E0312 12:46:19.779321 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149d48ba-e4d1-4684-ae20-f17d4e1f247f" containerName="mariadb-database-create"
Mar 12 12:46:19.785194 master-0 kubenswrapper[13984]: I0312 12:46:19.785165 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="149d48ba-e4d1-4684-ae20-f17d4e1f247f" containerName="mariadb-database-create"
Mar 12 12:46:19.785302 master-0 kubenswrapper[13984]: E0312 12:46:19.785288 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7907ad88-3c30-4b34-ac0d-6ac0106a75d1" containerName="init"
Mar 12 12:46:19.785365 master-0 kubenswrapper[13984]: I0312 12:46:19.785356 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="7907ad88-3c30-4b34-ac0d-6ac0106a75d1" containerName="init"
Mar 12 12:46:19.785838 master-0 kubenswrapper[13984]: I0312 12:46:19.785824 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="149d48ba-e4d1-4684-ae20-f17d4e1f247f" containerName="mariadb-database-create"
Mar 12 12:46:19.785915 master-0 kubenswrapper[13984]: I0312 12:46:19.785905 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="7907ad88-3c30-4b34-ac0d-6ac0106a75d1" containerName="init"
Mar 12 12:46:19.787078 master-0 kubenswrapper[13984]: I0312 12:46:19.787059 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:19.810465 master-0 kubenswrapper[13984]: I0312 12:46:19.800332 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-f98a5-default-internal-config-data"
Mar 12 12:46:19.860426 master-0 kubenswrapper[13984]: I0312 12:46:19.859384 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"]
Mar 12 12:46:19.870014 master-0 kubenswrapper[13984]: I0312 12:46:19.866698 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67b7d74975-bfgsb"]
Mar 12 12:46:19.886161 master-0 kubenswrapper[13984]: I0312 12:46:19.886106 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67b7d74975-bfgsb"]
Mar 12 12:46:19.970433 master-0 kubenswrapper[13984]: I0312 12:46:19.970385 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:19.970633 master-0 kubenswrapper[13984]: I0312 12:46:19.970575 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:19.970633 master-0 kubenswrapper[13984]: I0312 12:46:19.970604 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:19.970633 master-0 kubenswrapper[13984]: I0312 12:46:19.970629 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:19.970791 master-0 kubenswrapper[13984]: I0312 12:46:19.970653 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:19.970791 master-0 kubenswrapper[13984]: I0312 12:46:19.970673 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7js49\" (UniqueName: \"kubernetes.io/projected/7cef203b-e921-4159-a618-9e20584a7fc8-kube-api-access-7js49\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:19.970791 master-0 kubenswrapper[13984]: I0312 12:46:19.970732 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.049505 master-0 kubenswrapper[13984]: I0312 12:46:20.031185 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7907ad88-3c30-4b34-ac0d-6ac0106a75d1" path="/var/lib/kubelet/pods/7907ad88-3c30-4b34-ac0d-6ac0106a75d1/volumes"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.072459 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f" path="/var/lib/kubelet/pods/c87ca2bf-27f3-4cbd-94a7-9b7f6aca964f/volumes"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.072811 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.072980 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.073003 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.073023 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.073040 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.073060 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7js49\" (UniqueName: \"kubernetes.io/projected/7cef203b-e921-4159-a618-9e20584a7fc8-kube-api-access-7js49\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.073111 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.073644 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.075593 master-0 kubenswrapper[13984]: I0312 12:46:20.075388 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.094268 master-0 kubenswrapper[13984]: I0312 12:46:20.092358 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.108337 master-0 kubenswrapper[13984]: I0312 12:46:20.096153 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 12 12:46:20.108337 master-0 kubenswrapper[13984]: I0312 12:46:20.096203 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/95c4692dce0b9808ca42bee34ff116315d549daf622dbdf12b07464f0e4f3ac4/globalmount\"" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.108337 master-0 kubenswrapper[13984]: I0312 12:46:20.096510 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.108337 master-0 kubenswrapper[13984]: I0312 12:46:20.097145 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.113543 master-0 kubenswrapper[13984]: I0312 12:46:20.110243 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7js49\" (UniqueName: \"kubernetes.io/projected/7cef203b-e921-4159-a618-9e20584a7fc8-kube-api-access-7js49\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:20.175634 master-0 kubenswrapper[13984]: I0312 12:46:20.163615 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-df3c-account-create-update-rbfjs"
Mar 12 12:46:20.191546 master-0 kubenswrapper[13984]: I0312 12:46:20.191225 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" event={"ID":"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118","Type":"ContainerStarted","Data":"1f8686b1922e0425ed1102482e059e392e43f7ba55fed20380712f74fceb3a36"}
Mar 12 12:46:20.193720 master-0 kubenswrapper[13984]: I0312 12:46:20.193668 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6"
Mar 12 12:46:20.197401 master-0 kubenswrapper[13984]: I0312 12:46:20.196823 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-df3c-account-create-update-rbfjs" event={"ID":"f92e833d-9f24-4783-a5d6-89a7fa102026","Type":"ContainerDied","Data":"7f4b50056b5df0259a1f3a4ddbe67b57b7f0dfd8cf8239d65d3e945a68c53df3"}
Mar 12 12:46:20.197401 master-0 kubenswrapper[13984]: I0312 12:46:20.196856 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4b50056b5df0259a1f3a4ddbe67b57b7f0dfd8cf8239d65d3e945a68c53df3"
Mar 12 12:46:20.197401 master-0 kubenswrapper[13984]: I0312 12:46:20.196911 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-df3c-account-create-update-rbfjs"
Mar 12 12:46:20.378816 master-0 kubenswrapper[13984]: I0312 12:46:20.378767 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ltqf\" (UniqueName: \"kubernetes.io/projected/f92e833d-9f24-4783-a5d6-89a7fa102026-kube-api-access-4ltqf\") pod \"f92e833d-9f24-4783-a5d6-89a7fa102026\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") "
Mar 12 12:46:20.379002 master-0 kubenswrapper[13984]: I0312 12:46:20.378902 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f92e833d-9f24-4783-a5d6-89a7fa102026-operator-scripts\") pod \"f92e833d-9f24-4783-a5d6-89a7fa102026\" (UID: \"f92e833d-9f24-4783-a5d6-89a7fa102026\") "
Mar 12 12:46:20.379462 master-0 kubenswrapper[13984]: I0312 12:46:20.379438 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f92e833d-9f24-4783-a5d6-89a7fa102026-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f92e833d-9f24-4783-a5d6-89a7fa102026" (UID: "f92e833d-9f24-4783-a5d6-89a7fa102026"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:46:20.382055 master-0 kubenswrapper[13984]: I0312 12:46:20.382005 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f92e833d-9f24-4783-a5d6-89a7fa102026-kube-api-access-4ltqf" (OuterVolumeSpecName: "kube-api-access-4ltqf") pod "f92e833d-9f24-4783-a5d6-89a7fa102026" (UID: "f92e833d-9f24-4783-a5d6-89a7fa102026"). InnerVolumeSpecName "kube-api-access-4ltqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:46:20.382197 master-0 kubenswrapper[13984]: I0312 12:46:20.382165 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f92e833d-9f24-4783-a5d6-89a7fa102026-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:20.449061 master-0 kubenswrapper[13984]: I0312 12:46:20.448979 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" podStartSLOduration=4.448959794 podStartE2EDuration="4.448959794s" podCreationTimestamp="2026-03-12 12:46:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:20.44793461 +0000 UTC m=+1312.645950102" watchObservedRunningTime="2026-03-12 12:46:20.448959794 +0000 UTC m=+1312.646975296"
Mar 12 12:46:20.484547 master-0 kubenswrapper[13984]: I0312 12:46:20.484498 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ltqf\" (UniqueName: \"kubernetes.io/projected/f92e833d-9f24-4783-a5d6-89a7fa102026-kube-api-access-4ltqf\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:20.485080 master-0 kubenswrapper[13984]: I0312 12:46:20.485047 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " pod="openstack/glance-f98a5-default-external-api-0"
Mar 12 12:46:20.770459 master-0 kubenswrapper[13984]: I0312 12:46:20.770331 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0"
Mar 12 12:46:21.426953 master-0 kubenswrapper[13984]: I0312 12:46:21.426894 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"]
Mar 12 12:46:22.231830 master-0 kubenswrapper[13984]: I0312 12:46:22.231694 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"1aa2287a-21d8-417d-85a7-49baade8ffd8","Type":"ContainerStarted","Data":"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c"}
Mar 12 12:46:22.231830 master-0 kubenswrapper[13984]: I0312 12:46:22.231752 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"1aa2287a-21d8-417d-85a7-49baade8ffd8","Type":"ContainerStarted","Data":"50118f45ecc7c0340e306617ac6067556ae8402c2b33c5352edb91bd787d1c39"}
Mar 12 12:46:22.546271 master-0 kubenswrapper[13984]: I0312 12:46:22.545905 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:22.558731 master-0 kubenswrapper[13984]: I0312 12:46:22.558677 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:46:24.289428 master-0 kubenswrapper[13984]: I0312 12:46:24.289299 13984 generic.go:334] "Generic (PLEG): container finished" podID="04c8b647-d290-4861-8709-6f5ce55141d3" containerID="2cd511b3f5082cee0c8dab442e8c47fbf12993f38966dcf97f415e18b360a966" exitCode=0
Mar 12 12:46:24.289428 master-0 kubenswrapper[13984]: I0312 12:46:24.289385 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4767l" event={"ID":"04c8b647-d290-4861-8709-6f5ce55141d3","Type":"ContainerDied","Data":"2cd511b3f5082cee0c8dab442e8c47fbf12993f38966dcf97f415e18b360a966"}
Mar 12 12:46:24.294512 master-0 kubenswrapper[13984]: I0312 12:46:24.294424 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5pl65" event={"ID":"c8085b96-6908-4956-9e4c-290797974f87","Type":"ContainerStarted","Data":"302f561093f7d6e3870109b3038ba8c816e1647387e25759ec876f518657e4e0"}
Mar 12 12:46:24.360446 master-0 kubenswrapper[13984]: I0312 12:46:24.357155 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-5pl65" podStartSLOduration=3.212033419 podStartE2EDuration="10.357135011s" podCreationTimestamp="2026-03-12 12:46:14 +0000 UTC" firstStartedPulling="2026-03-12 12:46:16.685978725 +0000 UTC m=+1308.883994217" lastFinishedPulling="2026-03-12 12:46:23.831080317 +0000 UTC m=+1316.029095809" observedRunningTime="2026-03-12 12:46:24.349302718 +0000 UTC m=+1316.547318210" watchObservedRunningTime="2026-03-12 12:46:24.357135011 +0000 UTC m=+1316.555150503"
Mar 12 12:46:24.475798 master-0 kubenswrapper[13984]: W0312 12:46:24.475754 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cef203b_e921_4159_a618_9e20584a7fc8.slice/crio-76207373469a9034d9477d67d752fb1fefc4024299716cddecb6bca79400dc54 WatchSource:0}: Error finding container 76207373469a9034d9477d67d752fb1fefc4024299716cddecb6bca79400dc54: Status 404 returned error can't find the container with id 76207373469a9034d9477d67d752fb1fefc4024299716cddecb6bca79400dc54
Mar 12 12:46:24.477092 master-0 kubenswrapper[13984]: I0312 12:46:24.477036 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"]
Mar 12 12:46:25.036582 master-0 kubenswrapper[13984]: I0312 12:46:25.034920 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-ftw7v"]
Mar 12 12:46:25.036582 master-0 kubenswrapper[13984]: E0312 12:46:25.035877 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f92e833d-9f24-4783-a5d6-89a7fa102026" containerName="mariadb-account-create-update"
Mar 12 12:46:25.036582 master-0 kubenswrapper[13984]: I0312 12:46:25.035892 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="f92e833d-9f24-4783-a5d6-89a7fa102026" containerName="mariadb-account-create-update"
Mar 12 12:46:25.036582 master-0 kubenswrapper[13984]: I0312 12:46:25.036305 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="f92e833d-9f24-4783-a5d6-89a7fa102026" containerName="mariadb-account-create-update"
Mar 12 12:46:25.044529 master-0 kubenswrapper[13984]: I0312 12:46:25.042008 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.049019 master-0 kubenswrapper[13984]: I0312 12:46:25.046838 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data"
Mar 12 12:46:25.049019 master-0 kubenswrapper[13984]: I0312 12:46:25.047667 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts"
Mar 12 12:46:25.049842 master-0 kubenswrapper[13984]: I0312 12:46:25.049455 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-ftw7v"]
Mar 12 12:46:25.148502 master-0 kubenswrapper[13984]: I0312 12:46:25.145087 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-scripts\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.148502 master-0 kubenswrapper[13984]: I0312 12:46:25.145164 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wh27\" (UniqueName: \"kubernetes.io/projected/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-kube-api-access-2wh27\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.148502 master-0 kubenswrapper[13984]: I0312 12:46:25.145191 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-combined-ca-bundle\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.148502 master-0 kubenswrapper[13984]: I0312 12:46:25.145455 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-etc-podinfo\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.148502 master-0 kubenswrapper[13984]: I0312 12:46:25.145530 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data-merged\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.148502 master-0 kubenswrapper[13984]: I0312 12:46:25.145631 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.247463 master-0 kubenswrapper[13984]: I0312 12:46:25.247218 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data-merged\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.247463 master-0 kubenswrapper[13984]: I0312 12:46:25.247400 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.247463 master-0 kubenswrapper[13984]: I0312 12:46:25.247446 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-scripts\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.247463 master-0 kubenswrapper[13984]: I0312 12:46:25.247501 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wh27\" (UniqueName: \"kubernetes.io/projected/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-kube-api-access-2wh27\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.247463 master-0 kubenswrapper[13984]: I0312 12:46:25.247534 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-combined-ca-bundle\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.247940 master-0 kubenswrapper[13984]: I0312 12:46:25.247621 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-etc-podinfo\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.250948 master-0 kubenswrapper[13984]: I0312 12:46:25.250853 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-etc-podinfo\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.251430 master-0 kubenswrapper[13984]: I0312 12:46:25.251401 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data-merged\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.252733 master-0 kubenswrapper[13984]: I0312 12:46:25.252517 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-scripts\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.255552 master-0 kubenswrapper[13984]: I0312 12:46:25.255083 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.256540 master-0 kubenswrapper[13984]: I0312 12:46:25.256454 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-combined-ca-bundle\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.283152 master-0 kubenswrapper[13984]: I0312 12:46:25.283107 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wh27\" (UniqueName: \"kubernetes.io/projected/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-kube-api-access-2wh27\") pod \"ironic-db-sync-ftw7v\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.312719 master-0 kubenswrapper[13984]: I0312 12:46:25.312382 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"1aa2287a-21d8-417d-85a7-49baade8ffd8","Type":"ContainerStarted","Data":"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403"}
Mar 12 12:46:25.316691 master-0 kubenswrapper[13984]: I0312 12:46:25.315659 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"7cef203b-e921-4159-a618-9e20584a7fc8","Type":"ContainerStarted","Data":"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6"}
Mar 12 12:46:25.316691 master-0 kubenswrapper[13984]: I0312 12:46:25.315736 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"7cef203b-e921-4159-a618-9e20584a7fc8","Type":"ContainerStarted","Data":"76207373469a9034d9477d67d752fb1fefc4024299716cddecb6bca79400dc54"}
Mar 12 12:46:25.357768 master-0 kubenswrapper[13984]: I0312 12:46:25.357706 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"]
Mar 12 12:46:25.377663 master-0 kubenswrapper[13984]: I0312 12:46:25.375734 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f98a5-default-external-api-0" podStartSLOduration=8.375707006 podStartE2EDuration="8.375707006s" podCreationTimestamp="2026-03-12 12:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:25.361968414 +0000 UTC m=+1317.559983906" watchObservedRunningTime="2026-03-12 12:46:25.375707006 +0000 UTC m=+1317.573722498"
Mar 12 12:46:25.383111 master-0 kubenswrapper[13984]: I0312 12:46:25.383062 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-ftw7v"
Mar 12 12:46:25.477825 master-0 kubenswrapper[13984]: I0312 12:46:25.477668 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"]
Mar 12 12:46:25.954760 master-0 kubenswrapper[13984]: I0312 12:46:25.954699 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4767l"
Mar 12 12:46:26.073413 master-0 kubenswrapper[13984]: I0312 12:46:26.073363 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-combined-ca-bundle\") pod \"04c8b647-d290-4861-8709-6f5ce55141d3\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") "
Mar 12 12:46:26.073581 master-0 kubenswrapper[13984]: I0312 12:46:26.073463 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-config-data\") pod \"04c8b647-d290-4861-8709-6f5ce55141d3\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") "
Mar 12 12:46:26.073581 master-0 kubenswrapper[13984]: I0312 12:46:26.073573 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-fernet-keys\") pod \"04c8b647-d290-4861-8709-6f5ce55141d3\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") "
Mar 12 12:46:26.073687 master-0 kubenswrapper[13984]: I0312 12:46:26.073667 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-scripts\") pod \"04c8b647-d290-4861-8709-6f5ce55141d3\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") "
Mar 12 12:46:26.073728 master-0 kubenswrapper[13984]: I0312 12:46:26.073696 13984 reconciler_common.go:159]
"operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-credential-keys\") pod \"04c8b647-d290-4861-8709-6f5ce55141d3\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " Mar 12 12:46:26.073774 master-0 kubenswrapper[13984]: I0312 12:46:26.073730 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx8jz\" (UniqueName: \"kubernetes.io/projected/04c8b647-d290-4861-8709-6f5ce55141d3-kube-api-access-xx8jz\") pod \"04c8b647-d290-4861-8709-6f5ce55141d3\" (UID: \"04c8b647-d290-4861-8709-6f5ce55141d3\") " Mar 12 12:46:26.088804 master-0 kubenswrapper[13984]: I0312 12:46:26.087070 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-ftw7v"] Mar 12 12:46:26.094825 master-0 kubenswrapper[13984]: I0312 12:46:26.094762 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "04c8b647-d290-4861-8709-6f5ce55141d3" (UID: "04c8b647-d290-4861-8709-6f5ce55141d3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:26.094825 master-0 kubenswrapper[13984]: I0312 12:46:26.094803 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-scripts" (OuterVolumeSpecName: "scripts") pod "04c8b647-d290-4861-8709-6f5ce55141d3" (UID: "04c8b647-d290-4861-8709-6f5ce55141d3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:26.095212 master-0 kubenswrapper[13984]: I0312 12:46:26.094877 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "04c8b647-d290-4861-8709-6f5ce55141d3" (UID: "04c8b647-d290-4861-8709-6f5ce55141d3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:26.095212 master-0 kubenswrapper[13984]: I0312 12:46:26.094931 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04c8b647-d290-4861-8709-6f5ce55141d3-kube-api-access-xx8jz" (OuterVolumeSpecName: "kube-api-access-xx8jz") pod "04c8b647-d290-4861-8709-6f5ce55141d3" (UID: "04c8b647-d290-4861-8709-6f5ce55141d3"). InnerVolumeSpecName "kube-api-access-xx8jz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:26.105067 master-0 kubenswrapper[13984]: I0312 12:46:26.105040 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04c8b647-d290-4861-8709-6f5ce55141d3" (UID: "04c8b647-d290-4861-8709-6f5ce55141d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:26.121575 master-0 kubenswrapper[13984]: I0312 12:46:26.121505 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-config-data" (OuterVolumeSpecName: "config-data") pod "04c8b647-d290-4861-8709-6f5ce55141d3" (UID: "04c8b647-d290-4861-8709-6f5ce55141d3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:26.176023 master-0 kubenswrapper[13984]: I0312 12:46:26.175973 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:26.176023 master-0 kubenswrapper[13984]: I0312 12:46:26.176021 13984 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:26.176023 master-0 kubenswrapper[13984]: I0312 12:46:26.176034 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx8jz\" (UniqueName: \"kubernetes.io/projected/04c8b647-d290-4861-8709-6f5ce55141d3-kube-api-access-xx8jz\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:26.176310 master-0 kubenswrapper[13984]: I0312 12:46:26.176044 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:26.176310 master-0 kubenswrapper[13984]: I0312 12:46:26.176054 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:26.176310 master-0 kubenswrapper[13984]: I0312 12:46:26.176061 13984 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/04c8b647-d290-4861-8709-6f5ce55141d3-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:26.338794 master-0 kubenswrapper[13984]: I0312 12:46:26.338730 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-ftw7v" 
event={"ID":"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3","Type":"ContainerStarted","Data":"571b965bd681fb7e9d87242725b38ee35935bddca9f6a47491d4f0b27b813caf"} Mar 12 12:46:26.350923 master-0 kubenswrapper[13984]: I0312 12:46:26.350861 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"7cef203b-e921-4159-a618-9e20584a7fc8","Type":"ContainerStarted","Data":"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4"} Mar 12 12:46:26.351921 master-0 kubenswrapper[13984]: I0312 12:46:26.351901 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-internal-api-0" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-httpd" containerID="cri-o://75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4" gracePeriod=30 Mar 12 12:46:26.352047 master-0 kubenswrapper[13984]: I0312 12:46:26.352017 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-internal-api-0" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-log" containerID="cri-o://60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6" gracePeriod=30 Mar 12 12:46:26.359625 master-0 kubenswrapper[13984]: I0312 12:46:26.359593 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4767l" Mar 12 12:46:26.360352 master-0 kubenswrapper[13984]: I0312 12:46:26.360282 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4767l" event={"ID":"04c8b647-d290-4861-8709-6f5ce55141d3","Type":"ContainerDied","Data":"8473a43574fd6cbaac7dabaf2bc67a08e9b1df3bc7b9b32f27b5524aed878802"} Mar 12 12:46:26.360438 master-0 kubenswrapper[13984]: I0312 12:46:26.360372 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8473a43574fd6cbaac7dabaf2bc67a08e9b1df3bc7b9b32f27b5524aed878802" Mar 12 12:46:26.396554 master-0 kubenswrapper[13984]: I0312 12:46:26.396471 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f98a5-default-internal-api-0" podStartSLOduration=9.396448041 podStartE2EDuration="9.396448041s" podCreationTimestamp="2026-03-12 12:46:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:26.383566249 +0000 UTC m=+1318.581581741" watchObservedRunningTime="2026-03-12 12:46:26.396448041 +0000 UTC m=+1318.594463533" Mar 12 12:46:26.415814 master-0 kubenswrapper[13984]: I0312 12:46:26.415684 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4767l"] Mar 12 12:46:26.429353 master-0 kubenswrapper[13984]: I0312 12:46:26.429100 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4767l"] Mar 12 12:46:26.509807 master-0 kubenswrapper[13984]: I0312 12:46:26.509751 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-p7rh2"] Mar 12 12:46:26.510304 master-0 kubenswrapper[13984]: E0312 12:46:26.510280 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04c8b647-d290-4861-8709-6f5ce55141d3" containerName="keystone-bootstrap" Mar 12 12:46:26.510304 master-0 
kubenswrapper[13984]: I0312 12:46:26.510298 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="04c8b647-d290-4861-8709-6f5ce55141d3" containerName="keystone-bootstrap" Mar 12 12:46:26.510593 master-0 kubenswrapper[13984]: I0312 12:46:26.510572 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="04c8b647-d290-4861-8709-6f5ce55141d3" containerName="keystone-bootstrap" Mar 12 12:46:26.511319 master-0 kubenswrapper[13984]: I0312 12:46:26.511293 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.514773 master-0 kubenswrapper[13984]: I0312 12:46:26.514735 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 12 12:46:26.514997 master-0 kubenswrapper[13984]: I0312 12:46:26.514965 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 12 12:46:26.515048 master-0 kubenswrapper[13984]: I0312 12:46:26.515002 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 12 12:46:26.528755 master-0 kubenswrapper[13984]: I0312 12:46:26.528687 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p7rh2"] Mar 12 12:46:26.584235 master-0 kubenswrapper[13984]: I0312 12:46:26.584178 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-combined-ca-bundle\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.584330 master-0 kubenswrapper[13984]: I0312 12:46:26.584254 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-scripts\") pod 
\"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.584330 master-0 kubenswrapper[13984]: I0312 12:46:26.584279 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-fernet-keys\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.584430 master-0 kubenswrapper[13984]: I0312 12:46:26.584406 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-config-data\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.585536 master-0 kubenswrapper[13984]: I0312 12:46:26.584469 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-credential-keys\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.585536 master-0 kubenswrapper[13984]: I0312 12:46:26.584495 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgptq\" (UniqueName: \"kubernetes.io/projected/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-kube-api-access-mgptq\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.688996 master-0 kubenswrapper[13984]: I0312 12:46:26.688820 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-combined-ca-bundle\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.689795 master-0 kubenswrapper[13984]: I0312 12:46:26.689309 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-scripts\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.689795 master-0 kubenswrapper[13984]: I0312 12:46:26.689419 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-fernet-keys\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.689795 master-0 kubenswrapper[13984]: I0312 12:46:26.689645 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-config-data\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.689795 master-0 kubenswrapper[13984]: I0312 12:46:26.689758 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-credential-keys\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.689795 master-0 kubenswrapper[13984]: I0312 12:46:26.689787 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgptq\" (UniqueName: 
\"kubernetes.io/projected/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-kube-api-access-mgptq\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.698152 master-0 kubenswrapper[13984]: I0312 12:46:26.697226 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-credential-keys\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.698152 master-0 kubenswrapper[13984]: I0312 12:46:26.697375 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-fernet-keys\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.701537 master-0 kubenswrapper[13984]: I0312 12:46:26.699373 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-config-data\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.701537 master-0 kubenswrapper[13984]: I0312 12:46:26.701034 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-scripts\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.704625 master-0 kubenswrapper[13984]: I0312 12:46:26.704565 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-combined-ca-bundle\") pod 
\"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.717247 master-0 kubenswrapper[13984]: I0312 12:46:26.717197 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgptq\" (UniqueName: \"kubernetes.io/projected/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-kube-api-access-mgptq\") pod \"keystone-bootstrap-p7rh2\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") " pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.906555 master-0 kubenswrapper[13984]: I0312 12:46:26.906483 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p7rh2" Mar 12 12:46:26.998214 master-0 kubenswrapper[13984]: I0312 12:46:26.998176 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:46:27.086221 master-0 kubenswrapper[13984]: I0312 12:46:27.086133 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cdcd69d47-blzgv"] Mar 12 12:46:27.086473 master-0 kubenswrapper[13984]: I0312 12:46:27.086435 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" podUID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerName="dnsmasq-dns" containerID="cri-o://5e0ef48e5d179583437b13e96b72a8564e98578539e7336b091e6f744caea2aa" gracePeriod=10 Mar 12 12:46:27.417007 master-0 kubenswrapper[13984]: I0312 12:46:27.416196 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:27.476300 master-0 kubenswrapper[13984]: W0312 12:46:27.476240 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc6207a9_cfba_4912_9ee2_7ae929ac28bd.slice/crio-e50d2b4578b50d9bbfa44a5860df051a73ecce925ba13678d870837a7693fa75 WatchSource:0}: Error finding container e50d2b4578b50d9bbfa44a5860df051a73ecce925ba13678d870837a7693fa75: Status 404 returned error can't find the container with id e50d2b4578b50d9bbfa44a5860df051a73ecce925ba13678d870837a7693fa75 Mar 12 12:46:27.476996 master-0 kubenswrapper[13984]: I0312 12:46:27.476515 13984 generic.go:334] "Generic (PLEG): container finished" podID="7cef203b-e921-4159-a618-9e20584a7fc8" containerID="75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4" exitCode=0 Mar 12 12:46:27.476996 master-0 kubenswrapper[13984]: I0312 12:46:27.476547 13984 generic.go:334] "Generic (PLEG): container finished" podID="7cef203b-e921-4159-a618-9e20584a7fc8" containerID="60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6" exitCode=143 Mar 12 12:46:27.476996 master-0 kubenswrapper[13984]: I0312 12:46:27.476636 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"7cef203b-e921-4159-a618-9e20584a7fc8","Type":"ContainerDied","Data":"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4"} Mar 12 12:46:27.476996 master-0 kubenswrapper[13984]: I0312 12:46:27.476666 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"7cef203b-e921-4159-a618-9e20584a7fc8","Type":"ContainerDied","Data":"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6"} Mar 12 12:46:27.476996 master-0 kubenswrapper[13984]: I0312 12:46:27.476678 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"7cef203b-e921-4159-a618-9e20584a7fc8","Type":"ContainerDied","Data":"76207373469a9034d9477d67d752fb1fefc4024299716cddecb6bca79400dc54"} Mar 12 12:46:27.476996 master-0 kubenswrapper[13984]: I0312 12:46:27.476695 13984 scope.go:117] "RemoveContainer" containerID="75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4" Mar 12 12:46:27.480960 master-0 kubenswrapper[13984]: I0312 12:46:27.480864 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-p7rh2"] Mar 12 12:46:27.481731 master-0 kubenswrapper[13984]: I0312 12:46:27.481698 13984 generic.go:334] "Generic (PLEG): container finished" podID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerID="5e0ef48e5d179583437b13e96b72a8564e98578539e7336b091e6f744caea2aa" exitCode=0 Mar 12 12:46:27.481833 master-0 kubenswrapper[13984]: I0312 12:46:27.481804 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" event={"ID":"e90795eb-e2e0-44f9-8e28-cece07a4230e","Type":"ContainerDied","Data":"5e0ef48e5d179583437b13e96b72a8564e98578539e7336b091e6f744caea2aa"} Mar 12 12:46:27.481950 master-0 kubenswrapper[13984]: I0312 12:46:27.481920 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-external-api-0" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-log" containerID="cri-o://fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c" gracePeriod=30 Mar 12 12:46:27.482057 master-0 kubenswrapper[13984]: I0312 12:46:27.482029 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-external-api-0" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-httpd" containerID="cri-o://1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403" gracePeriod=30 Mar 12 12:46:27.534555 master-0 kubenswrapper[13984]: I0312 12:46:27.532618 
13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-config-data\") pod \"7cef203b-e921-4159-a618-9e20584a7fc8\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " Mar 12 12:46:27.534555 master-0 kubenswrapper[13984]: I0312 12:46:27.532761 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-logs\") pod \"7cef203b-e921-4159-a618-9e20584a7fc8\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " Mar 12 12:46:27.534555 master-0 kubenswrapper[13984]: I0312 12:46:27.532837 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7js49\" (UniqueName: \"kubernetes.io/projected/7cef203b-e921-4159-a618-9e20584a7fc8-kube-api-access-7js49\") pod \"7cef203b-e921-4159-a618-9e20584a7fc8\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " Mar 12 12:46:27.534555 master-0 kubenswrapper[13984]: I0312 12:46:27.533021 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"7cef203b-e921-4159-a618-9e20584a7fc8\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " Mar 12 12:46:27.534555 master-0 kubenswrapper[13984]: I0312 12:46:27.533079 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-scripts\") pod \"7cef203b-e921-4159-a618-9e20584a7fc8\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " Mar 12 12:46:27.534555 master-0 kubenswrapper[13984]: I0312 12:46:27.533137 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-combined-ca-bundle\") pod 
\"7cef203b-e921-4159-a618-9e20584a7fc8\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " Mar 12 12:46:27.534555 master-0 kubenswrapper[13984]: I0312 12:46:27.533185 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-httpd-run\") pod \"7cef203b-e921-4159-a618-9e20584a7fc8\" (UID: \"7cef203b-e921-4159-a618-9e20584a7fc8\") " Mar 12 12:46:27.536168 master-0 kubenswrapper[13984]: I0312 12:46:27.535646 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-logs" (OuterVolumeSpecName: "logs") pod "7cef203b-e921-4159-a618-9e20584a7fc8" (UID: "7cef203b-e921-4159-a618-9e20584a7fc8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:46:27.539521 master-0 kubenswrapper[13984]: I0312 12:46:27.537282 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7cef203b-e921-4159-a618-9e20584a7fc8" (UID: "7cef203b-e921-4159-a618-9e20584a7fc8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:46:27.545173 master-0 kubenswrapper[13984]: I0312 12:46:27.540614 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cef203b-e921-4159-a618-9e20584a7fc8-kube-api-access-7js49" (OuterVolumeSpecName: "kube-api-access-7js49") pod "7cef203b-e921-4159-a618-9e20584a7fc8" (UID: "7cef203b-e921-4159-a618-9e20584a7fc8"). InnerVolumeSpecName "kube-api-access-7js49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:27.545173 master-0 kubenswrapper[13984]: I0312 12:46:27.542866 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-scripts" (OuterVolumeSpecName: "scripts") pod "7cef203b-e921-4159-a618-9e20584a7fc8" (UID: "7cef203b-e921-4159-a618-9e20584a7fc8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:27.554743 master-0 kubenswrapper[13984]: I0312 12:46:27.554652 13984 scope.go:117] "RemoveContainer" containerID="60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6" Mar 12 12:46:27.569261 master-0 kubenswrapper[13984]: I0312 12:46:27.569198 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616" (OuterVolumeSpecName: "glance") pod "7cef203b-e921-4159-a618-9e20584a7fc8" (UID: "7cef203b-e921-4159-a618-9e20584a7fc8"). InnerVolumeSpecName "pvc-49a24b6d-1684-4129-8811-b908648797de". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 12:46:27.611822 master-0 kubenswrapper[13984]: I0312 12:46:27.611737 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cef203b-e921-4159-a618-9e20584a7fc8" (UID: "7cef203b-e921-4159-a618-9e20584a7fc8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:27.616202 master-0 kubenswrapper[13984]: I0312 12:46:27.616162 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-config-data" (OuterVolumeSpecName: "config-data") pod "7cef203b-e921-4159-a618-9e20584a7fc8" (UID: "7cef203b-e921-4159-a618-9e20584a7fc8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:27.635867 master-0 kubenswrapper[13984]: I0312 12:46:27.635667 13984 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:27.635867 master-0 kubenswrapper[13984]: I0312 12:46:27.635747 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:27.635867 master-0 kubenswrapper[13984]: I0312 12:46:27.635759 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7cef203b-e921-4159-a618-9e20584a7fc8-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:27.635867 master-0 kubenswrapper[13984]: I0312 12:46:27.635781 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7js49\" (UniqueName: \"kubernetes.io/projected/7cef203b-e921-4159-a618-9e20584a7fc8-kube-api-access-7js49\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:27.635867 master-0 kubenswrapper[13984]: I0312 12:46:27.635846 13984 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") on node \"master-0\" " Mar 12 12:46:27.635867 master-0 kubenswrapper[13984]: I0312 12:46:27.635858 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:27.635867 master-0 kubenswrapper[13984]: I0312 12:46:27.635868 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7cef203b-e921-4159-a618-9e20584a7fc8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:27.648233 master-0 kubenswrapper[13984]: I0312 12:46:27.643747 13984 scope.go:117] "RemoveContainer" containerID="75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4" Mar 12 12:46:27.652889 master-0 kubenswrapper[13984]: E0312 12:46:27.648341 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4\": container with ID starting with 75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4 not found: ID does not exist" containerID="75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4" Mar 12 12:46:27.652889 master-0 kubenswrapper[13984]: I0312 12:46:27.648427 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4"} err="failed to get container status \"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4\": rpc error: code = NotFound desc = could not find container \"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4\": container with ID starting with 75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4 not found: ID does not exist" Mar 12 12:46:27.652889 master-0 kubenswrapper[13984]: I0312 12:46:27.648462 13984 scope.go:117] "RemoveContainer" containerID="60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6" Mar 12 12:46:27.653544 master-0 kubenswrapper[13984]: E0312 12:46:27.653474 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6\": container with ID starting with 60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6 not found: ID does not exist" 
containerID="60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6" Mar 12 12:46:27.653600 master-0 kubenswrapper[13984]: I0312 12:46:27.653552 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6"} err="failed to get container status \"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6\": rpc error: code = NotFound desc = could not find container \"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6\": container with ID starting with 60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6 not found: ID does not exist" Mar 12 12:46:27.653600 master-0 kubenswrapper[13984]: I0312 12:46:27.653581 13984 scope.go:117] "RemoveContainer" containerID="75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4" Mar 12 12:46:27.654155 master-0 kubenswrapper[13984]: I0312 12:46:27.654096 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4"} err="failed to get container status \"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4\": rpc error: code = NotFound desc = could not find container \"75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4\": container with ID starting with 75a393c2e4f323618df81c13b05708607413af4ff4642cb1b90aa52495f1f0b4 not found: ID does not exist" Mar 12 12:46:27.654214 master-0 kubenswrapper[13984]: I0312 12:46:27.654158 13984 scope.go:117] "RemoveContainer" containerID="60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6" Mar 12 12:46:27.654665 master-0 kubenswrapper[13984]: I0312 12:46:27.654600 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6"} err="failed to get container status 
\"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6\": rpc error: code = NotFound desc = could not find container \"60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6\": container with ID starting with 60388c6c1a3112a096268a4e309f54d0d71118797175bd73f8de33c8deeb00f6 not found: ID does not exist" Mar 12 12:46:27.676609 master-0 kubenswrapper[13984]: I0312 12:46:27.675979 13984 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 12 12:46:27.676609 master-0 kubenswrapper[13984]: I0312 12:46:27.676160 13984 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-49a24b6d-1684-4129-8811-b908648797de" (UniqueName: "kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616") on node "master-0" Mar 12 12:46:27.738238 master-0 kubenswrapper[13984]: I0312 12:46:27.738188 13984 reconciler_common.go:293] "Volume detached for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:27.932764 master-0 kubenswrapper[13984]: E0312 12:46:27.932694 13984 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa2287a_21d8_417d_85a7_49baade8ffd8.slice/crio-conmon-1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1aa2287a_21d8_417d_85a7_49baade8ffd8.slice/crio-1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403.scope\": RecentStats: unable to find data in memory cache]" Mar 12 12:46:28.012699 master-0 kubenswrapper[13984]: I0312 12:46:28.012628 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04c8b647-d290-4861-8709-6f5ce55141d3" 
path="/var/lib/kubelet/pods/04c8b647-d290-4861-8709-6f5ce55141d3/volumes" Mar 12 12:46:28.296776 master-0 kubenswrapper[13984]: I0312 12:46:28.296703 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:46:28.426723 master-0 kubenswrapper[13984]: I0312 12:46:28.426609 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:28.458594 master-0 kubenswrapper[13984]: I0312 12:46:28.453241 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-svc\") pod \"e90795eb-e2e0-44f9-8e28-cece07a4230e\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " Mar 12 12:46:28.458594 master-0 kubenswrapper[13984]: I0312 12:46:28.453391 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-nb\") pod \"e90795eb-e2e0-44f9-8e28-cece07a4230e\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " Mar 12 12:46:28.458594 master-0 kubenswrapper[13984]: I0312 12:46:28.453467 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-swift-storage-0\") pod \"e90795eb-e2e0-44f9-8e28-cece07a4230e\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " Mar 12 12:46:28.458594 master-0 kubenswrapper[13984]: I0312 12:46:28.453545 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rc7x7\" (UniqueName: \"kubernetes.io/projected/e90795eb-e2e0-44f9-8e28-cece07a4230e-kube-api-access-rc7x7\") pod \"e90795eb-e2e0-44f9-8e28-cece07a4230e\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " Mar 12 12:46:28.458594 
master-0 kubenswrapper[13984]: I0312 12:46:28.453659 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-sb\") pod \"e90795eb-e2e0-44f9-8e28-cece07a4230e\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " Mar 12 12:46:28.458594 master-0 kubenswrapper[13984]: I0312 12:46:28.453704 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-config\") pod \"e90795eb-e2e0-44f9-8e28-cece07a4230e\" (UID: \"e90795eb-e2e0-44f9-8e28-cece07a4230e\") " Mar 12 12:46:28.462559 master-0 kubenswrapper[13984]: I0312 12:46:28.461708 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e90795eb-e2e0-44f9-8e28-cece07a4230e-kube-api-access-rc7x7" (OuterVolumeSpecName: "kube-api-access-rc7x7") pod "e90795eb-e2e0-44f9-8e28-cece07a4230e" (UID: "e90795eb-e2e0-44f9-8e28-cece07a4230e"). InnerVolumeSpecName "kube-api-access-rc7x7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:28.524035 master-0 kubenswrapper[13984]: I0312 12:46:28.523768 13984 generic.go:334] "Generic (PLEG): container finished" podID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerID="1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403" exitCode=0 Mar 12 12:46:28.524035 master-0 kubenswrapper[13984]: I0312 12:46:28.523816 13984 generic.go:334] "Generic (PLEG): container finished" podID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerID="fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c" exitCode=143 Mar 12 12:46:28.524035 master-0 kubenswrapper[13984]: I0312 12:46:28.523828 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"1aa2287a-21d8-417d-85a7-49baade8ffd8","Type":"ContainerDied","Data":"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403"} Mar 12 12:46:28.524035 master-0 kubenswrapper[13984]: I0312 12:46:28.523878 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:28.524035 master-0 kubenswrapper[13984]: I0312 12:46:28.523890 13984 scope.go:117] "RemoveContainer" containerID="1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403" Mar 12 12:46:28.526200 master-0 kubenswrapper[13984]: I0312 12:46:28.523879 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"1aa2287a-21d8-417d-85a7-49baade8ffd8","Type":"ContainerDied","Data":"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c"} Mar 12 12:46:28.526200 master-0 kubenswrapper[13984]: I0312 12:46:28.524740 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"1aa2287a-21d8-417d-85a7-49baade8ffd8","Type":"ContainerDied","Data":"50118f45ecc7c0340e306617ac6067556ae8402c2b33c5352edb91bd787d1c39"} Mar 12 12:46:28.531128 master-0 kubenswrapper[13984]: I0312 12:46:28.531048 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" event={"ID":"e90795eb-e2e0-44f9-8e28-cece07a4230e","Type":"ContainerDied","Data":"01d9dc37a611215f8476111b5d1f9320393922c76695a7186b8722bb5537cc6e"} Mar 12 12:46:28.531128 master-0 kubenswrapper[13984]: I0312 12:46:28.531067 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cdcd69d47-blzgv" Mar 12 12:46:28.547705 master-0 kubenswrapper[13984]: I0312 12:46:28.546012 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p7rh2" event={"ID":"dc6207a9-cfba-4912-9ee2-7ae929ac28bd","Type":"ContainerStarted","Data":"a5aa62d5bf411d8ef2f51678de8b42959a273fc854aecd20230730cbe6a556ea"} Mar 12 12:46:28.547705 master-0 kubenswrapper[13984]: I0312 12:46:28.546087 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p7rh2" event={"ID":"dc6207a9-cfba-4912-9ee2-7ae929ac28bd","Type":"ContainerStarted","Data":"e50d2b4578b50d9bbfa44a5860df051a73ecce925ba13678d870837a7693fa75"} Mar 12 12:46:28.548238 master-0 kubenswrapper[13984]: I0312 12:46:28.548185 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e90795eb-e2e0-44f9-8e28-cece07a4230e" (UID: "e90795eb-e2e0-44f9-8e28-cece07a4230e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:28.551424 master-0 kubenswrapper[13984]: I0312 12:46:28.551141 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.552358 master-0 kubenswrapper[13984]: I0312 12:46:28.552227 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e90795eb-e2e0-44f9-8e28-cece07a4230e" (UID: "e90795eb-e2e0-44f9-8e28-cece07a4230e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:28.555578 master-0 kubenswrapper[13984]: I0312 12:46:28.555532 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-config-data\") pod \"1aa2287a-21d8-417d-85a7-49baade8ffd8\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " Mar 12 12:46:28.555706 master-0 kubenswrapper[13984]: I0312 12:46:28.555621 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-httpd-run\") pod \"1aa2287a-21d8-417d-85a7-49baade8ffd8\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " Mar 12 12:46:28.555768 master-0 kubenswrapper[13984]: I0312 12:46:28.555759 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"1aa2287a-21d8-417d-85a7-49baade8ffd8\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " Mar 12 12:46:28.556231 master-0 kubenswrapper[13984]: I0312 12:46:28.555873 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwkps\" (UniqueName: \"kubernetes.io/projected/1aa2287a-21d8-417d-85a7-49baade8ffd8-kube-api-access-qwkps\") pod \"1aa2287a-21d8-417d-85a7-49baade8ffd8\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " Mar 12 12:46:28.556231 master-0 kubenswrapper[13984]: I0312 12:46:28.555964 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-logs\") pod \"1aa2287a-21d8-417d-85a7-49baade8ffd8\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " Mar 12 12:46:28.556231 master-0 kubenswrapper[13984]: I0312 12:46:28.555987 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-combined-ca-bundle\") pod \"1aa2287a-21d8-417d-85a7-49baade8ffd8\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " Mar 12 12:46:28.556231 master-0 kubenswrapper[13984]: I0312 12:46:28.556098 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-scripts\") pod \"1aa2287a-21d8-417d-85a7-49baade8ffd8\" (UID: \"1aa2287a-21d8-417d-85a7-49baade8ffd8\") " Mar 12 12:46:28.556231 master-0 kubenswrapper[13984]: I0312 12:46:28.556112 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1aa2287a-21d8-417d-85a7-49baade8ffd8" (UID: "1aa2287a-21d8-417d-85a7-49baade8ffd8"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:46:28.557177 master-0 kubenswrapper[13984]: I0312 12:46:28.556544 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-logs" (OuterVolumeSpecName: "logs") pod "1aa2287a-21d8-417d-85a7-49baade8ffd8" (UID: "1aa2287a-21d8-417d-85a7-49baade8ffd8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:46:28.557177 master-0 kubenswrapper[13984]: I0312 12:46:28.556626 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rc7x7\" (UniqueName: \"kubernetes.io/projected/e90795eb-e2e0-44f9-8e28-cece07a4230e-kube-api-access-rc7x7\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.557177 master-0 kubenswrapper[13984]: I0312 12:46:28.556651 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.557177 master-0 kubenswrapper[13984]: I0312 12:46:28.556674 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.557177 master-0 kubenswrapper[13984]: I0312 12:46:28.556685 13984 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.557676 master-0 kubenswrapper[13984]: I0312 12:46:28.557625 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e90795eb-e2e0-44f9-8e28-cece07a4230e" (UID: "e90795eb-e2e0-44f9-8e28-cece07a4230e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:28.561128 master-0 kubenswrapper[13984]: I0312 12:46:28.561086 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1aa2287a-21d8-417d-85a7-49baade8ffd8-kube-api-access-qwkps" (OuterVolumeSpecName: "kube-api-access-qwkps") pod "1aa2287a-21d8-417d-85a7-49baade8ffd8" (UID: "1aa2287a-21d8-417d-85a7-49baade8ffd8"). InnerVolumeSpecName "kube-api-access-qwkps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:28.564378 master-0 kubenswrapper[13984]: I0312 12:46:28.564333 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-scripts" (OuterVolumeSpecName: "scripts") pod "1aa2287a-21d8-417d-85a7-49baade8ffd8" (UID: "1aa2287a-21d8-417d-85a7-49baade8ffd8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:28.568475 master-0 kubenswrapper[13984]: I0312 12:46:28.568418 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e90795eb-e2e0-44f9-8e28-cece07a4230e" (UID: "e90795eb-e2e0-44f9-8e28-cece07a4230e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:28.604907 master-0 kubenswrapper[13984]: I0312 12:46:28.604781 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-p7rh2" podStartSLOduration=2.603549977 podStartE2EDuration="2.603549977s" podCreationTimestamp="2026-03-12 12:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:28.564057663 +0000 UTC m=+1320.762073165" watchObservedRunningTime="2026-03-12 12:46:28.603549977 +0000 UTC m=+1320.801565469" Mar 12 12:46:28.608080 master-0 kubenswrapper[13984]: I0312 12:46:28.608017 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1aa2287a-21d8-417d-85a7-49baade8ffd8" (UID: "1aa2287a-21d8-417d-85a7-49baade8ffd8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:28.614930 master-0 kubenswrapper[13984]: I0312 12:46:28.614871 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e" (OuterVolumeSpecName: "glance") pod "1aa2287a-21d8-417d-85a7-49baade8ffd8" (UID: "1aa2287a-21d8-417d-85a7-49baade8ffd8"). InnerVolumeSpecName "pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 12:46:28.631850 master-0 kubenswrapper[13984]: I0312 12:46:28.631308 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-config" (OuterVolumeSpecName: "config") pod "e90795eb-e2e0-44f9-8e28-cece07a4230e" (UID: "e90795eb-e2e0-44f9-8e28-cece07a4230e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:46:28.634007 master-0 kubenswrapper[13984]: I0312 12:46:28.633933 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:46:28.644561 master-0 kubenswrapper[13984]: I0312 12:46:28.644427 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:46:28.652759 master-0 kubenswrapper[13984]: I0312 12:46:28.652671 13984 scope.go:117] "RemoveContainer" containerID="fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.653287 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: E0312 12:46:28.653794 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-log" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.653808 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-log" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: E0312 12:46:28.653822 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-httpd" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.653828 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-httpd" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: E0312 12:46:28.653843 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-httpd" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.653849 13984 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-httpd" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: E0312 12:46:28.653876 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerName="dnsmasq-dns" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.653881 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerName="dnsmasq-dns" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: E0312 12:46:28.653895 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerName="init" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.653943 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerName="init" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: E0312 12:46:28.653954 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-log" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.653961 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-log" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.654168 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="e90795eb-e2e0-44f9-8e28-cece07a4230e" containerName="dnsmasq-dns" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.654181 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-log" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.654192 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" containerName="glance-httpd" Mar 12 12:46:28.655710 master-0 
kubenswrapper[13984]: I0312 12:46:28.654201 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-log" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.654208 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" containerName="glance-httpd" Mar 12 12:46:28.655710 master-0 kubenswrapper[13984]: I0312 12:46:28.655175 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.657937 master-0 kubenswrapper[13984]: I0312 12:46:28.657312 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-f98a5-default-internal-config-data" Mar 12 12:46:28.657937 master-0 kubenswrapper[13984]: I0312 12:46:28.657627 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659688 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1aa2287a-21d8-417d-85a7-49baade8ffd8-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659727 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659740 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659756 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659782 13984 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") on node \"master-0\" " Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659798 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659813 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwkps\" (UniqueName: \"kubernetes.io/projected/1aa2287a-21d8-417d-85a7-49baade8ffd8-kube-api-access-qwkps\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.660168 master-0 kubenswrapper[13984]: I0312 12:46:28.659828 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e90795eb-e2e0-44f9-8e28-cece07a4230e-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.691270 master-0 kubenswrapper[13984]: I0312 12:46:28.691113 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-config-data" (OuterVolumeSpecName: "config-data") pod "1aa2287a-21d8-417d-85a7-49baade8ffd8" (UID: "1aa2287a-21d8-417d-85a7-49baade8ffd8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:28.723046 master-0 kubenswrapper[13984]: I0312 12:46:28.722911 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:46:28.739978 master-0 kubenswrapper[13984]: I0312 12:46:28.739943 13984 scope.go:117] "RemoveContainer" containerID="1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403" Mar 12 12:46:28.741437 master-0 kubenswrapper[13984]: E0312 12:46:28.740949 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403\": container with ID starting with 1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403 not found: ID does not exist" containerID="1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403" Mar 12 12:46:28.741437 master-0 kubenswrapper[13984]: I0312 12:46:28.741017 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403"} err="failed to get container status \"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403\": rpc error: code = NotFound desc = could not find container \"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403\": container with ID starting with 1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403 not found: ID does not exist" Mar 12 12:46:28.741437 master-0 kubenswrapper[13984]: I0312 12:46:28.741043 13984 scope.go:117] "RemoveContainer" containerID="fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c" Mar 12 12:46:28.741917 master-0 kubenswrapper[13984]: E0312 12:46:28.741867 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c\": container with 
ID starting with fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c not found: ID does not exist" containerID="fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c" Mar 12 12:46:28.741973 master-0 kubenswrapper[13984]: I0312 12:46:28.741922 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c"} err="failed to get container status \"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c\": rpc error: code = NotFound desc = could not find container \"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c\": container with ID starting with fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c not found: ID does not exist" Mar 12 12:46:28.741973 master-0 kubenswrapper[13984]: I0312 12:46:28.741953 13984 scope.go:117] "RemoveContainer" containerID="1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403" Mar 12 12:46:28.758222 master-0 kubenswrapper[13984]: I0312 12:46:28.757908 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403"} err="failed to get container status \"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403\": rpc error: code = NotFound desc = could not find container \"1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403\": container with ID starting with 1f33f86e2aaff9821b50640c797832bf861fafc2f2be03751b1df2085b6fc403 not found: ID does not exist" Mar 12 12:46:28.758222 master-0 kubenswrapper[13984]: I0312 12:46:28.757962 13984 scope.go:117] "RemoveContainer" containerID="fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c" Mar 12 12:46:28.758632 master-0 kubenswrapper[13984]: I0312 12:46:28.758599 13984 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c"} err="failed to get container status \"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c\": rpc error: code = NotFound desc = could not find container \"fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c\": container with ID starting with fccf0993d4bbc12bc4d3df9842ec43e822a5e77cfbe07e4540e478ad24c4200c not found: ID does not exist" Mar 12 12:46:28.758632 master-0 kubenswrapper[13984]: I0312 12:46:28.758621 13984 scope.go:117] "RemoveContainer" containerID="5e0ef48e5d179583437b13e96b72a8564e98578539e7336b091e6f744caea2aa" Mar 12 12:46:28.779776 master-0 kubenswrapper[13984]: I0312 12:46:28.779720 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.780023 master-0 kubenswrapper[13984]: I0312 12:46:28.779868 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.780023 master-0 kubenswrapper[13984]: I0312 12:46:28.779914 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.780023 master-0 
kubenswrapper[13984]: I0312 12:46:28.780017 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kk7d\" (UniqueName: \"kubernetes.io/projected/4b525941-42e8-455b-b117-fae2d47b7977-kube-api-access-8kk7d\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.781660 master-0 kubenswrapper[13984]: I0312 12:46:28.780389 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-internal-tls-certs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.781660 master-0 kubenswrapper[13984]: I0312 12:46:28.780443 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.781660 master-0 kubenswrapper[13984]: I0312 12:46:28.780515 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.781660 master-0 kubenswrapper[13984]: I0312 12:46:28.780539 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-httpd-run\") pod 
\"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.781660 master-0 kubenswrapper[13984]: I0312 12:46:28.780852 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1aa2287a-21d8-417d-85a7-49baade8ffd8-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.781660 master-0 kubenswrapper[13984]: I0312 12:46:28.781064 13984 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 12 12:46:28.781660 master-0 kubenswrapper[13984]: I0312 12:46:28.781200 13984 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383" (UniqueName: "kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e") on node "master-0" Mar 12 12:46:28.822885 master-0 kubenswrapper[13984]: I0312 12:46:28.822772 13984 scope.go:117] "RemoveContainer" containerID="f7e33c7c00afd4252c842a3cf5b45d23b8311281def545680df537f481cbaea6" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.886652 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kk7d\" (UniqueName: \"kubernetes.io/projected/4b525941-42e8-455b-b117-fae2d47b7977-kube-api-access-8kk7d\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.886759 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-internal-tls-certs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 
kubenswrapper[13984]: I0312 12:46:28.886799 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.886864 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.886888 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.887169 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.887280 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " 
pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.887327 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.887414 13984 reconciler_common.go:293] "Volume detached for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.888365 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.890287 master-0 kubenswrapper[13984]: I0312 12:46:28.888574 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.900352 master-0 kubenswrapper[13984]: I0312 12:46:28.892622 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.902330 
master-0 kubenswrapper[13984]: I0312 12:46:28.902281 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.902978 master-0 kubenswrapper[13984]: I0312 12:46:28.902955 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 12 12:46:28.903068 master-0 kubenswrapper[13984]: I0312 12:46:28.902985 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/95c4692dce0b9808ca42bee34ff116315d549daf622dbdf12b07464f0e4f3ac4/globalmount\"" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.906404 master-0 kubenswrapper[13984]: I0312 12:46:28.906338 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.913684 master-0 kubenswrapper[13984]: I0312 12:46:28.913616 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:46:28.916317 master-0 kubenswrapper[13984]: I0312 12:46:28.915365 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-internal-tls-certs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.928637 master-0 kubenswrapper[13984]: I0312 12:46:28.928597 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kk7d\" (UniqueName: \"kubernetes.io/projected/4b525941-42e8-455b-b117-fae2d47b7977-kube-api-access-8kk7d\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:28.933986 master-0 kubenswrapper[13984]: I0312 12:46:28.933912 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:46:28.971527 master-0 kubenswrapper[13984]: I0312 12:46:28.962166 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cdcd69d47-blzgv"] Mar 12 12:46:28.981134 master-0 kubenswrapper[13984]: I0312 12:46:28.980988 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cdcd69d47-blzgv"] Mar 12 12:46:28.991535 master-0 kubenswrapper[13984]: I0312 12:46:28.990483 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:46:28.998498 master-0 kubenswrapper[13984]: I0312 12:46:28.997740 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.005944 master-0 kubenswrapper[13984]: I0312 12:46:29.003925 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 12 12:46:29.005944 master-0 kubenswrapper[13984]: I0312 12:46:29.004419 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:46:29.014921 master-0 kubenswrapper[13984]: I0312 12:46:29.010060 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-f98a5-default-external-config-data" Mar 12 12:46:29.092984 master-0 kubenswrapper[13984]: I0312 12:46:29.092919 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.093213 master-0 kubenswrapper[13984]: I0312 12:46:29.093005 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.093213 master-0 kubenswrapper[13984]: I0312 12:46:29.093049 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-public-tls-certs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.093213 master-0 
kubenswrapper[13984]: I0312 12:46:29.093140 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.093213 master-0 kubenswrapper[13984]: I0312 12:46:29.093196 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.093431 master-0 kubenswrapper[13984]: I0312 12:46:29.093243 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.093431 master-0 kubenswrapper[13984]: I0312 12:46:29.093307 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4t2g\" (UniqueName: \"kubernetes.io/projected/920ddbc6-460a-455d-b6eb-8a9393dc169e-kube-api-access-k4t2g\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.093530 master-0 kubenswrapper[13984]: I0312 12:46:29.093431 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-config-data\") pod 
\"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.195459 master-0 kubenswrapper[13984]: I0312 12:46:29.195391 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.195902 master-0 kubenswrapper[13984]: I0312 12:46:29.195852 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.197118 master-0 kubenswrapper[13984]: I0312 12:46:29.197098 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.197216 master-0 kubenswrapper[13984]: I0312 12:46:29.197138 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4t2g\" (UniqueName: \"kubernetes.io/projected/920ddbc6-460a-455d-b6eb-8a9393dc169e-kube-api-access-k4t2g\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.197430 master-0 kubenswrapper[13984]: I0312 12:46:29.197407 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.197633 master-0 kubenswrapper[13984]: I0312 12:46:29.197616 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.197766 master-0 kubenswrapper[13984]: I0312 12:46:29.197753 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.197853 master-0 kubenswrapper[13984]: I0312 12:46:29.197841 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-public-tls-certs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.198399 master-0 kubenswrapper[13984]: I0312 12:46:29.198382 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.198846 master-0 kubenswrapper[13984]: I0312 12:46:29.198810 13984 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.199746 master-0 kubenswrapper[13984]: I0312 12:46:29.199704 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 12 12:46:29.199823 master-0 kubenswrapper[13984]: I0312 12:46:29.199752 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/705cac2184d649d44bc54d9fe6c523322613cb4b44faad7af7e5abd2b4c3196c/globalmount\"" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.201017 master-0 kubenswrapper[13984]: I0312 12:46:29.200937 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-public-tls-certs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.203057 master-0 kubenswrapper[13984]: I0312 12:46:29.202998 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.203353 master-0 kubenswrapper[13984]: I0312 12:46:29.203318 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.205047 master-0 kubenswrapper[13984]: I0312 12:46:29.205015 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:29.224722 master-0 kubenswrapper[13984]: I0312 12:46:29.221670 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4t2g\" (UniqueName: \"kubernetes.io/projected/920ddbc6-460a-455d-b6eb-8a9393dc169e-kube-api-access-k4t2g\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:30.000100 master-0 kubenswrapper[13984]: I0312 12:46:29.999800 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1aa2287a-21d8-417d-85a7-49baade8ffd8" path="/var/lib/kubelet/pods/1aa2287a-21d8-417d-85a7-49baade8ffd8/volumes" Mar 12 12:46:30.006703 master-0 kubenswrapper[13984]: I0312 12:46:30.001853 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cef203b-e921-4159-a618-9e20584a7fc8" path="/var/lib/kubelet/pods/7cef203b-e921-4159-a618-9e20584a7fc8/volumes" Mar 12 12:46:30.006703 master-0 kubenswrapper[13984]: I0312 12:46:30.002967 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e90795eb-e2e0-44f9-8e28-cece07a4230e" path="/var/lib/kubelet/pods/e90795eb-e2e0-44f9-8e28-cece07a4230e/volumes" Mar 12 12:46:30.283720 master-0 kubenswrapper[13984]: I0312 12:46:30.283441 13984 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:30.509403 master-0 kubenswrapper[13984]: I0312 12:46:30.509362 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:31.130748 master-0 kubenswrapper[13984]: I0312 12:46:31.130694 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:46:31.676734 master-0 kubenswrapper[13984]: I0312 12:46:31.676570 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:31.744763 master-0 kubenswrapper[13984]: I0312 12:46:31.744395 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:31.798609 master-0 kubenswrapper[13984]: I0312 12:46:31.798536 13984 trace.go:236] Trace[110969057]: "Calculate volume metrics of mysql-db for pod openstack/openstack-cell1-galera-0" (12-Mar-2026 12:46:29.485) (total time: 2312ms): Mar 12 12:46:31.798609 master-0 kubenswrapper[13984]: Trace[110969057]: [2.312715909s] [2.312715909s] END Mar 12 12:46:33.628504 master-0 kubenswrapper[13984]: I0312 12:46:33.628355 13984 generic.go:334] "Generic (PLEG): container finished" podID="c8085b96-6908-4956-9e4c-290797974f87" containerID="302f561093f7d6e3870109b3038ba8c816e1647387e25759ec876f518657e4e0" exitCode=0 Mar 12 12:46:33.628504 master-0 kubenswrapper[13984]: I0312 12:46:33.628419 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5pl65" event={"ID":"c8085b96-6908-4956-9e4c-290797974f87","Type":"ContainerDied","Data":"302f561093f7d6e3870109b3038ba8c816e1647387e25759ec876f518657e4e0"} Mar 12 12:46:42.909629 master-0 kubenswrapper[13984]: I0312 12:46:42.907966 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-5pl65" Mar 12 12:46:43.010169 master-0 kubenswrapper[13984]: I0312 12:46:43.010047 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8085b96-6908-4956-9e4c-290797974f87-logs\") pod \"c8085b96-6908-4956-9e4c-290797974f87\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " Mar 12 12:46:43.010169 master-0 kubenswrapper[13984]: I0312 12:46:43.010102 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-config-data\") pod \"c8085b96-6908-4956-9e4c-290797974f87\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " Mar 12 12:46:43.010417 master-0 kubenswrapper[13984]: I0312 12:46:43.010206 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-combined-ca-bundle\") pod \"c8085b96-6908-4956-9e4c-290797974f87\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " Mar 12 12:46:43.010417 master-0 kubenswrapper[13984]: I0312 12:46:43.010279 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-947cb\" (UniqueName: \"kubernetes.io/projected/c8085b96-6908-4956-9e4c-290797974f87-kube-api-access-947cb\") pod \"c8085b96-6908-4956-9e4c-290797974f87\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " Mar 12 12:46:43.010417 master-0 kubenswrapper[13984]: I0312 12:46:43.010353 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-scripts\") pod \"c8085b96-6908-4956-9e4c-290797974f87\" (UID: \"c8085b96-6908-4956-9e4c-290797974f87\") " Mar 12 12:46:43.010577 master-0 kubenswrapper[13984]: I0312 12:46:43.010464 13984 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8085b96-6908-4956-9e4c-290797974f87-logs" (OuterVolumeSpecName: "logs") pod "c8085b96-6908-4956-9e4c-290797974f87" (UID: "c8085b96-6908-4956-9e4c-290797974f87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:46:43.013994 master-0 kubenswrapper[13984]: I0312 12:46:43.013933 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c8085b96-6908-4956-9e4c-290797974f87-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:43.014754 master-0 kubenswrapper[13984]: I0312 12:46:43.014710 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8085b96-6908-4956-9e4c-290797974f87-kube-api-access-947cb" (OuterVolumeSpecName: "kube-api-access-947cb") pod "c8085b96-6908-4956-9e4c-290797974f87" (UID: "c8085b96-6908-4956-9e4c-290797974f87"). InnerVolumeSpecName "kube-api-access-947cb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:43.017296 master-0 kubenswrapper[13984]: I0312 12:46:43.017187 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-scripts" (OuterVolumeSpecName: "scripts") pod "c8085b96-6908-4956-9e4c-290797974f87" (UID: "c8085b96-6908-4956-9e4c-290797974f87"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:43.047225 master-0 kubenswrapper[13984]: I0312 12:46:43.046891 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-config-data" (OuterVolumeSpecName: "config-data") pod "c8085b96-6908-4956-9e4c-290797974f87" (UID: "c8085b96-6908-4956-9e4c-290797974f87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:46:43.049081 master-0 kubenswrapper[13984]: I0312 12:46:43.049039 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c8085b96-6908-4956-9e4c-290797974f87" (UID: "c8085b96-6908-4956-9e4c-290797974f87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:46:43.117338 master-0 kubenswrapper[13984]: I0312 12:46:43.116927 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:43.117338 master-0 kubenswrapper[13984]: I0312 12:46:43.116974 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:43.117338 master-0 kubenswrapper[13984]: I0312 12:46:43.116989 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-947cb\" (UniqueName: \"kubernetes.io/projected/c8085b96-6908-4956-9e4c-290797974f87-kube-api-access-947cb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:43.117338 master-0 kubenswrapper[13984]: I0312 12:46:43.117003 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c8085b96-6908-4956-9e4c-290797974f87-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:43.772677 master-0 kubenswrapper[13984]: I0312 12:46:43.772436 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-5pl65" event={"ID":"c8085b96-6908-4956-9e4c-290797974f87","Type":"ContainerDied","Data":"c6dd281ef0960bbb187a9bb02d0fb3f242851b4396175a946c3682cdb441e9cf"}
Mar 12 12:46:43.772677 master-0 kubenswrapper[13984]: I0312 12:46:43.772496 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6dd281ef0960bbb187a9bb02d0fb3f242851b4396175a946c3682cdb441e9cf"
Mar 12 12:46:43.772677 master-0 kubenswrapper[13984]: I0312 12:46:43.772582 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-5pl65"
Mar 12 12:46:43.780169 master-0 kubenswrapper[13984]: I0312 12:46:43.780130 13984 generic.go:334] "Generic (PLEG): container finished" podID="dc6207a9-cfba-4912-9ee2-7ae929ac28bd" containerID="a5aa62d5bf411d8ef2f51678de8b42959a273fc854aecd20230730cbe6a556ea" exitCode=0
Mar 12 12:46:43.780390 master-0 kubenswrapper[13984]: I0312 12:46:43.780191 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p7rh2" event={"ID":"dc6207a9-cfba-4912-9ee2-7ae929ac28bd","Type":"ContainerDied","Data":"a5aa62d5bf411d8ef2f51678de8b42959a273fc854aecd20230730cbe6a556ea"}
Mar 12 12:46:43.781651 master-0 kubenswrapper[13984]: I0312 12:46:43.781619 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"4b525941-42e8-455b-b117-fae2d47b7977","Type":"ContainerStarted","Data":"ebab59b5ea049c72253b7bbe7432951627489d78e0bcb1162b8c4ac28cd3f2e4"}
Mar 12 12:46:44.585684 master-0 kubenswrapper[13984]: I0312 12:46:44.585295 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5cbb6f6f66-dz95j"]
Mar 12 12:46:44.586423 master-0 kubenswrapper[13984]: E0312 12:46:44.585910 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8085b96-6908-4956-9e4c-290797974f87" containerName="placement-db-sync"
Mar 12 12:46:44.586423 master-0 kubenswrapper[13984]: I0312 12:46:44.585933 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8085b96-6908-4956-9e4c-290797974f87" containerName="placement-db-sync"
Mar 12 12:46:44.586423 master-0 kubenswrapper[13984]: I0312 12:46:44.586281 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8085b96-6908-4956-9e4c-290797974f87" containerName="placement-db-sync"
Mar 12 12:46:44.606389 master-0 kubenswrapper[13984]: I0312 12:46:44.587768 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.606389 master-0 kubenswrapper[13984]: I0312 12:46:44.594379 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 12 12:46:44.606389 master-0 kubenswrapper[13984]: I0312 12:46:44.596559 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 12 12:46:44.606389 master-0 kubenswrapper[13984]: I0312 12:46:44.596801 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 12 12:46:44.606389 master-0 kubenswrapper[13984]: I0312 12:46:44.597285 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 12 12:46:44.685719 master-0 kubenswrapper[13984]: I0312 12:46:44.685650 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cbb6f6f66-dz95j"]
Mar 12 12:46:44.785548 master-0 kubenswrapper[13984]: I0312 12:46:44.783872 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-public-tls-certs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.785548 master-0 kubenswrapper[13984]: I0312 12:46:44.783938 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-combined-ca-bundle\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.785548 master-0 kubenswrapper[13984]: I0312 12:46:44.783984 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa87d2e-769b-4517-a700-73d24e233846-logs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.785548 master-0 kubenswrapper[13984]: I0312 12:46:44.784033 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-internal-tls-certs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.785548 master-0 kubenswrapper[13984]: I0312 12:46:44.784076 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-config-data\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.785548 master-0 kubenswrapper[13984]: I0312 12:46:44.784124 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnqqt\" (UniqueName: \"kubernetes.io/projected/efa87d2e-769b-4517-a700-73d24e233846-kube-api-access-xnqqt\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.785548 master-0 kubenswrapper[13984]: I0312 12:46:44.784226 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-scripts\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.887628 master-0 kubenswrapper[13984]: I0312 12:46:44.886981 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-scripts\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.887816 master-0 kubenswrapper[13984]: I0312 12:46:44.887742 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-public-tls-certs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.887816 master-0 kubenswrapper[13984]: I0312 12:46:44.887792 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-combined-ca-bundle\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.887934 master-0 kubenswrapper[13984]: I0312 12:46:44.887859 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa87d2e-769b-4517-a700-73d24e233846-logs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.887934 master-0 kubenswrapper[13984]: I0312 12:46:44.887927 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-internal-tls-certs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.888031 master-0 kubenswrapper[13984]: I0312 12:46:44.887989 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-config-data\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.888085 master-0 kubenswrapper[13984]: I0312 12:46:44.888057 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnqqt\" (UniqueName: \"kubernetes.io/projected/efa87d2e-769b-4517-a700-73d24e233846-kube-api-access-xnqqt\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.888963 master-0 kubenswrapper[13984]: I0312 12:46:44.888917 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa87d2e-769b-4517-a700-73d24e233846-logs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.893401 master-0 kubenswrapper[13984]: I0312 12:46:44.893350 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-combined-ca-bundle\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.893589 master-0 kubenswrapper[13984]: I0312 12:46:44.893403 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-scripts\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.894933 master-0 kubenswrapper[13984]: I0312 12:46:44.894907 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-config-data\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.899903 master-0 kubenswrapper[13984]: I0312 12:46:44.899861 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-public-tls-certs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.904247 master-0 kubenswrapper[13984]: I0312 12:46:44.903765 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-internal-tls-certs\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.940613 master-0 kubenswrapper[13984]: I0312 12:46:44.937631 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnqqt\" (UniqueName: \"kubernetes.io/projected/efa87d2e-769b-4517-a700-73d24e233846-kube-api-access-xnqqt\") pod \"placement-5cbb6f6f66-dz95j\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:44.997082 master-0 kubenswrapper[13984]: I0312 12:46:44.997017 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:45.125392 master-0 kubenswrapper[13984]: I0312 12:46:45.125337 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"]
Mar 12 12:46:45.285576 master-0 kubenswrapper[13984]: I0312 12:46:45.283503 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p7rh2"
Mar 12 12:46:45.404170 master-0 kubenswrapper[13984]: I0312 12:46:45.403062 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-credential-keys\") pod \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") "
Mar 12 12:46:45.404170 master-0 kubenswrapper[13984]: I0312 12:46:45.403175 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-fernet-keys\") pod \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") "
Mar 12 12:46:45.404170 master-0 kubenswrapper[13984]: I0312 12:46:45.403220 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-config-data\") pod \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") "
Mar 12 12:46:45.404170 master-0 kubenswrapper[13984]: I0312 12:46:45.403285 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-scripts\") pod \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") "
Mar 12 12:46:45.404170 master-0 kubenswrapper[13984]: I0312 12:46:45.403302 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgptq\" (UniqueName: \"kubernetes.io/projected/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-kube-api-access-mgptq\") pod \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") "
Mar 12 12:46:45.404170 master-0 kubenswrapper[13984]: I0312 12:46:45.403363 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-combined-ca-bundle\") pod \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\" (UID: \"dc6207a9-cfba-4912-9ee2-7ae929ac28bd\") "
Mar 12 12:46:45.413054 master-0 kubenswrapper[13984]: I0312 12:46:45.412994 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-scripts" (OuterVolumeSpecName: "scripts") pod "dc6207a9-cfba-4912-9ee2-7ae929ac28bd" (UID: "dc6207a9-cfba-4912-9ee2-7ae929ac28bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:46:45.414560 master-0 kubenswrapper[13984]: I0312 12:46:45.414497 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dc6207a9-cfba-4912-9ee2-7ae929ac28bd" (UID: "dc6207a9-cfba-4912-9ee2-7ae929ac28bd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:46:45.416178 master-0 kubenswrapper[13984]: I0312 12:46:45.416066 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-kube-api-access-mgptq" (OuterVolumeSpecName: "kube-api-access-mgptq") pod "dc6207a9-cfba-4912-9ee2-7ae929ac28bd" (UID: "dc6207a9-cfba-4912-9ee2-7ae929ac28bd"). InnerVolumeSpecName "kube-api-access-mgptq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:46:45.416409 master-0 kubenswrapper[13984]: I0312 12:46:45.416364 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dc6207a9-cfba-4912-9ee2-7ae929ac28bd" (UID: "dc6207a9-cfba-4912-9ee2-7ae929ac28bd"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:46:45.439062 master-0 kubenswrapper[13984]: I0312 12:46:45.439007 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-config-data" (OuterVolumeSpecName: "config-data") pod "dc6207a9-cfba-4912-9ee2-7ae929ac28bd" (UID: "dc6207a9-cfba-4912-9ee2-7ae929ac28bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:46:45.441930 master-0 kubenswrapper[13984]: I0312 12:46:45.441861 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc6207a9-cfba-4912-9ee2-7ae929ac28bd" (UID: "dc6207a9-cfba-4912-9ee2-7ae929ac28bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:46:45.511519 master-0 kubenswrapper[13984]: I0312 12:46:45.511436 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:45.511669 master-0 kubenswrapper[13984]: I0312 12:46:45.511527 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgptq\" (UniqueName: \"kubernetes.io/projected/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-kube-api-access-mgptq\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:45.511669 master-0 kubenswrapper[13984]: I0312 12:46:45.511547 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:45.511669 master-0 kubenswrapper[13984]: I0312 12:46:45.511594 13984 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-credential-keys\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:45.511669 master-0 kubenswrapper[13984]: I0312 12:46:45.511610 13984 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-fernet-keys\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:45.511669 master-0 kubenswrapper[13984]: I0312 12:46:45.511626 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc6207a9-cfba-4912-9ee2-7ae929ac28bd-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:46:45.581501 master-0 kubenswrapper[13984]: I0312 12:46:45.581401 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5cbb6f6f66-dz95j"]
Mar 12 12:46:45.594179 master-0 kubenswrapper[13984]: W0312 12:46:45.594133 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefa87d2e_769b_4517_a700_73d24e233846.slice/crio-47ce580d81fbd81c5739b622e0cc01c252f290ff92c766dd388f83489dd36010 WatchSource:0}: Error finding container 47ce580d81fbd81c5739b622e0cc01c252f290ff92c766dd388f83489dd36010: Status 404 returned error can't find the container with id 47ce580d81fbd81c5739b622e0cc01c252f290ff92c766dd388f83489dd36010
Mar 12 12:46:45.821983 master-0 kubenswrapper[13984]: I0312 12:46:45.821898 13984 generic.go:334] "Generic (PLEG): container finished" podID="27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" containerID="2fe9b4fc783a87e54e2961785e872cb7ebda82067cf0a46abee2334e1c183281" exitCode=0
Mar 12 12:46:45.822098 master-0 kubenswrapper[13984]: I0312 12:46:45.822016 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-ftw7v" event={"ID":"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3","Type":"ContainerDied","Data":"2fe9b4fc783a87e54e2961785e872cb7ebda82067cf0a46abee2334e1c183281"}
Mar 12 12:46:45.831860 master-0 kubenswrapper[13984]: I0312 12:46:45.831667 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"920ddbc6-460a-455d-b6eb-8a9393dc169e","Type":"ContainerStarted","Data":"b67f129a214b57aa3a755f00afafecf41db3a50dc3fd55d84439913149e59cef"}
Mar 12 12:46:45.840621 master-0 kubenswrapper[13984]: I0312 12:46:45.840499 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-p7rh2" event={"ID":"dc6207a9-cfba-4912-9ee2-7ae929ac28bd","Type":"ContainerDied","Data":"e50d2b4578b50d9bbfa44a5860df051a73ecce925ba13678d870837a7693fa75"}
Mar 12 12:46:45.840621 master-0 kubenswrapper[13984]: I0312 12:46:45.840575 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e50d2b4578b50d9bbfa44a5860df051a73ecce925ba13678d870837a7693fa75"
Mar 12 12:46:45.840723 master-0 kubenswrapper[13984]: I0312 12:46:45.840708 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-p7rh2"
Mar 12 12:46:45.850769 master-0 kubenswrapper[13984]: I0312 12:46:45.850639 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb6f6f66-dz95j" event={"ID":"efa87d2e-769b-4517-a700-73d24e233846","Type":"ContainerStarted","Data":"47ce580d81fbd81c5739b622e0cc01c252f290ff92c766dd388f83489dd36010"}
Mar 12 12:46:45.856830 master-0 kubenswrapper[13984]: I0312 12:46:45.854935 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"4b525941-42e8-455b-b117-fae2d47b7977","Type":"ContainerStarted","Data":"5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e"}
Mar 12 12:46:46.053259 master-0 kubenswrapper[13984]: I0312 12:46:46.053204 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7d857f7b77-g6ptr"]
Mar 12 12:46:46.054269 master-0 kubenswrapper[13984]: E0312 12:46:46.053673 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6207a9-cfba-4912-9ee2-7ae929ac28bd" containerName="keystone-bootstrap"
Mar 12 12:46:46.054269 master-0 kubenswrapper[13984]: I0312 12:46:46.053702 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6207a9-cfba-4912-9ee2-7ae929ac28bd" containerName="keystone-bootstrap"
Mar 12 12:46:46.054269 master-0 kubenswrapper[13984]: I0312 12:46:46.054015 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc6207a9-cfba-4912-9ee2-7ae929ac28bd" containerName="keystone-bootstrap"
Mar 12 12:46:46.054847 master-0 kubenswrapper[13984]: I0312 12:46:46.054813 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d857f7b77-g6ptr"]
Mar 12 12:46:46.054954 master-0 kubenswrapper[13984]: I0312 12:46:46.054927 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.059954 master-0 kubenswrapper[13984]: I0312 12:46:46.059606 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 12 12:46:46.061532 master-0 kubenswrapper[13984]: I0312 12:46:46.060940 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 12 12:46:46.061532 master-0 kubenswrapper[13984]: I0312 12:46:46.061109 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 12 12:46:46.061532 master-0 kubenswrapper[13984]: I0312 12:46:46.061130 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 12 12:46:46.061532 master-0 kubenswrapper[13984]: I0312 12:46:46.061433 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 12 12:46:46.226266 master-0 kubenswrapper[13984]: I0312 12:46:46.226212 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-internal-tls-certs\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.226266 master-0 kubenswrapper[13984]: I0312 12:46:46.226265 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xnn\" (UniqueName: \"kubernetes.io/projected/f4c65284-94b1-4a57-9953-0b5f19554376-kube-api-access-n9xnn\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.226587 master-0 kubenswrapper[13984]: I0312 12:46:46.226299 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-config-data\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.226587 master-0 kubenswrapper[13984]: I0312 12:46:46.226327 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-credential-keys\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.226587 master-0 kubenswrapper[13984]: I0312 12:46:46.226378 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-fernet-keys\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.226587 master-0 kubenswrapper[13984]: I0312 12:46:46.226404 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-combined-ca-bundle\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.226587 master-0 kubenswrapper[13984]: I0312 12:46:46.226439 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-scripts\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.226587 master-0 kubenswrapper[13984]: I0312 12:46:46.226460 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-public-tls-certs\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.328523 master-0 kubenswrapper[13984]: I0312 12:46:46.328382 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-internal-tls-certs\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.328523 master-0 kubenswrapper[13984]: I0312 12:46:46.328450 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xnn\" (UniqueName: \"kubernetes.io/projected/f4c65284-94b1-4a57-9953-0b5f19554376-kube-api-access-n9xnn\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.328850 master-0 kubenswrapper[13984]: I0312 12:46:46.328522 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-config-data\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.333589 master-0 kubenswrapper[13984]: I0312 12:46:46.329055 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-credential-keys\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.333589 master-0 kubenswrapper[13984]: I0312 12:46:46.329250 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-fernet-keys\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.333589 master-0 kubenswrapper[13984]: I0312 12:46:46.329317 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-combined-ca-bundle\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.333589 master-0 kubenswrapper[13984]: I0312 12:46:46.329397 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-scripts\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.333589 master-0 kubenswrapper[13984]: I0312 12:46:46.329430 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-public-tls-certs\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.333589 master-0 kubenswrapper[13984]: I0312 12:46:46.333427 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-fernet-keys\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.334028 master-0 kubenswrapper[13984]: I0312 12:46:46.333951 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-public-tls-certs\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.335061 master-0 kubenswrapper[13984]: I0312 12:46:46.334676 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-scripts\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.335061 master-0 kubenswrapper[13984]: I0312 12:46:46.334932 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-internal-tls-certs\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.335061 master-0 kubenswrapper[13984]: I0312 12:46:46.335026 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-config-data\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.337088 master-0 kubenswrapper[13984]: I0312 12:46:46.337028 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-credential-keys\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.338245 master-0 kubenswrapper[13984]: I0312 12:46:46.338209 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4c65284-94b1-4a57-9953-0b5f19554376-combined-ca-bundle\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.348662 master-0 kubenswrapper[13984]: I0312 12:46:46.348607 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xnn\" (UniqueName: \"kubernetes.io/projected/f4c65284-94b1-4a57-9953-0b5f19554376-kube-api-access-n9xnn\") pod \"keystone-7d857f7b77-g6ptr\" (UID: \"f4c65284-94b1-4a57-9953-0b5f19554376\") " pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.407182 master-0 kubenswrapper[13984]: I0312 12:46:46.404892 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:46:46.884555 master-0 kubenswrapper[13984]: I0312 12:46:46.881685 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb6f6f66-dz95j" event={"ID":"efa87d2e-769b-4517-a700-73d24e233846","Type":"ContainerStarted","Data":"4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae"}
Mar 12 12:46:46.884555 master-0 kubenswrapper[13984]: I0312 12:46:46.881822 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb6f6f66-dz95j" event={"ID":"efa87d2e-769b-4517-a700-73d24e233846","Type":"ContainerStarted","Data":"20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f"}
Mar 12 12:46:46.884555 master-0 kubenswrapper[13984]: I0312 12:46:46.882091 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:46.884555 master-0 kubenswrapper[13984]: I0312 12:46:46.882185 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5cbb6f6f66-dz95j"
Mar 12 12:46:46.893505 master-0 kubenswrapper[13984]: I0312 12:46:46.893248 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"4b525941-42e8-455b-b117-fae2d47b7977","Type":"ContainerStarted","Data":"23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532"}
Mar 12 12:46:46.903517 master-0 kubenswrapper[13984]: I0312 12:46:46.903351 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-ftw7v" event={"ID":"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3","Type":"ContainerStarted","Data":"090315b728384dca4a5ff387b80c6d14f3812cb087d7eb1f6b37c177b3518981"}
Mar 12 12:46:46.909506 master-0 kubenswrapper[13984]: I0312 12:46:46.907127 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"920ddbc6-460a-455d-b6eb-8a9393dc169e","Type":"ContainerStarted","Data":"54414d412150bc428c6f0d18b48a1bd10be8e921597c44cbc9ca14d21b796c12"}
Mar 12 12:46:46.909506 master-0 kubenswrapper[13984]: I0312 12:46:46.907174 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"920ddbc6-460a-455d-b6eb-8a9393dc169e","Type":"ContainerStarted","Data":"c1921e384336ad0fddcfac6b8330b593a2d4a272e635610aa7d493e8bde23d84"}
Mar 12 12:46:46.936989 master-0 kubenswrapper[13984]: I0312 12:46:46.934457 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-db-sync-rcknr" event={"ID":"c180ef3d-38aa-4843-b682-92e609e6d856","Type":"ContainerStarted","Data":"013d81cfdee7bee90efff72ae769540222de088bf2428818912f4829a5a21aa9"}
Mar 12 12:46:46.940163 master-0 kubenswrapper[13984]: I0312 12:46:46.938095 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5cbb6f6f66-dz95j" podStartSLOduration=2.9380759960000002 podStartE2EDuration="2.938075996s" podCreationTimestamp="2026-03-12 12:46:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:46.907345647
+0000 UTC m=+1339.105361139" watchObservedRunningTime="2026-03-12 12:46:46.938075996 +0000 UTC m=+1339.136091488" Mar 12 12:46:46.957618 master-0 kubenswrapper[13984]: I0312 12:46:46.957572 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7d857f7b77-g6ptr"] Mar 12 12:46:46.965386 master-0 kubenswrapper[13984]: I0312 12:46:46.965299 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-ftw7v" podStartSLOduration=4.585564736 podStartE2EDuration="22.965281973s" podCreationTimestamp="2026-03-12 12:46:24 +0000 UTC" firstStartedPulling="2026-03-12 12:46:26.098822484 +0000 UTC m=+1318.296837976" lastFinishedPulling="2026-03-12 12:46:44.478539721 +0000 UTC m=+1336.676555213" observedRunningTime="2026-03-12 12:46:46.943688238 +0000 UTC m=+1339.141703780" watchObservedRunningTime="2026-03-12 12:46:46.965281973 +0000 UTC m=+1339.163297465" Mar 12 12:46:46.996671 master-0 kubenswrapper[13984]: I0312 12:46:46.995230 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f98a5-default-external-api-0" podStartSLOduration=18.995209944 podStartE2EDuration="18.995209944s" podCreationTimestamp="2026-03-12 12:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:46.967444494 +0000 UTC m=+1339.165459986" watchObservedRunningTime="2026-03-12 12:46:46.995209944 +0000 UTC m=+1339.193225436" Mar 12 12:46:47.042154 master-0 kubenswrapper[13984]: I0312 12:46:47.042085 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f98a5-default-internal-api-0" podStartSLOduration=19.042065161 podStartE2EDuration="19.042065161s" podCreationTimestamp="2026-03-12 12:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:46.993942934 +0000 
UTC m=+1339.191958426" watchObservedRunningTime="2026-03-12 12:46:47.042065161 +0000 UTC m=+1339.240080653" Mar 12 12:46:47.060912 master-0 kubenswrapper[13984]: I0312 12:46:47.060839 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-db-sync-rcknr" podStartSLOduration=4.932893193 podStartE2EDuration="33.06082068s" podCreationTimestamp="2026-03-12 12:46:14 +0000 UTC" firstStartedPulling="2026-03-12 12:46:16.591750549 +0000 UTC m=+1308.789766041" lastFinishedPulling="2026-03-12 12:46:44.719678036 +0000 UTC m=+1336.917693528" observedRunningTime="2026-03-12 12:46:47.018783026 +0000 UTC m=+1339.216798518" watchObservedRunningTime="2026-03-12 12:46:47.06082068 +0000 UTC m=+1339.258836172" Mar 12 12:46:47.953181 master-0 kubenswrapper[13984]: I0312 12:46:47.953090 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d857f7b77-g6ptr" event={"ID":"f4c65284-94b1-4a57-9953-0b5f19554376","Type":"ContainerStarted","Data":"8a79a1cd84a8067c2aeab9d27800b7a17dfec4d1d704932b11666a3b22f41f65"} Mar 12 12:46:47.953181 master-0 kubenswrapper[13984]: I0312 12:46:47.953146 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7d857f7b77-g6ptr" event={"ID":"f4c65284-94b1-4a57-9953-0b5f19554376","Type":"ContainerStarted","Data":"848e5d445d93f2fbaa393dbe33986bae9ef0dfbefb58b487c0eb755b47b58dad"} Mar 12 12:46:47.997586 master-0 kubenswrapper[13984]: I0312 12:46:47.997392 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7d857f7b77-g6ptr" podStartSLOduration=2.9973633939999997 podStartE2EDuration="2.997363394s" podCreationTimestamp="2026-03-12 12:46:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:47.978952343 +0000 UTC m=+1340.176967825" watchObservedRunningTime="2026-03-12 12:46:47.997363394 +0000 UTC m=+1340.195378906" Mar 12 
12:46:48.962334 master-0 kubenswrapper[13984]: I0312 12:46:48.962260 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7d857f7b77-g6ptr" Mar 12 12:46:50.510597 master-0 kubenswrapper[13984]: I0312 12:46:50.510525 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:50.511133 master-0 kubenswrapper[13984]: I0312 12:46:50.510614 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:50.548262 master-0 kubenswrapper[13984]: I0312 12:46:50.544080 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:50.557994 master-0 kubenswrapper[13984]: I0312 12:46:50.557930 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:50.981266 master-0 kubenswrapper[13984]: I0312 12:46:50.981205 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:50.981266 master-0 kubenswrapper[13984]: I0312 12:46:50.981245 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:51.745801 master-0 kubenswrapper[13984]: I0312 12:46:51.745729 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:51.748957 master-0 kubenswrapper[13984]: I0312 12:46:51.747927 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:51.792931 master-0 kubenswrapper[13984]: I0312 12:46:51.792887 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:51.795153 master-0 kubenswrapper[13984]: I0312 12:46:51.795080 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:51.992318 master-0 kubenswrapper[13984]: I0312 12:46:51.992275 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:51.992552 master-0 kubenswrapper[13984]: I0312 12:46:51.992441 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:53.002937 master-0 kubenswrapper[13984]: I0312 12:46:53.002880 13984 generic.go:334] "Generic (PLEG): container finished" podID="d2a33401-2a04-4b67-ad4f-702034d7a0a6" containerID="eae2a9932d934690d2763d8202034296c040081ae3224dcceab9627695765370" exitCode=0 Mar 12 12:46:53.003451 master-0 kubenswrapper[13984]: I0312 12:46:53.002976 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dfnvv" event={"ID":"d2a33401-2a04-4b67-ad4f-702034d7a0a6","Type":"ContainerDied","Data":"eae2a9932d934690d2763d8202034296c040081ae3224dcceab9627695765370"} Mar 12 12:46:53.550030 master-0 kubenswrapper[13984]: I0312 12:46:53.549944 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:54.015633 master-0 kubenswrapper[13984]: I0312 12:46:54.015579 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:46:54.015633 master-0 kubenswrapper[13984]: I0312 12:46:54.015619 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 12 12:46:54.131307 master-0 kubenswrapper[13984]: I0312 12:46:54.130957 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:46:54.302570 master-0 
kubenswrapper[13984]: I0312 12:46:54.302431 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:54.311252 master-0 kubenswrapper[13984]: I0312 12:46:54.311200 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:46:54.486472 master-0 kubenswrapper[13984]: I0312 12:46:54.486347 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:54.574988 master-0 kubenswrapper[13984]: I0312 12:46:54.574832 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-config\") pod \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " Mar 12 12:46:54.574988 master-0 kubenswrapper[13984]: I0312 12:46:54.574936 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2lqs\" (UniqueName: \"kubernetes.io/projected/d2a33401-2a04-4b67-ad4f-702034d7a0a6-kube-api-access-v2lqs\") pod \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " Mar 12 12:46:54.575391 master-0 kubenswrapper[13984]: I0312 12:46:54.575073 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-combined-ca-bundle\") pod \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\" (UID: \"d2a33401-2a04-4b67-ad4f-702034d7a0a6\") " Mar 12 12:46:54.587831 master-0 kubenswrapper[13984]: I0312 12:46:54.587754 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2a33401-2a04-4b67-ad4f-702034d7a0a6-kube-api-access-v2lqs" (OuterVolumeSpecName: "kube-api-access-v2lqs") pod 
"d2a33401-2a04-4b67-ad4f-702034d7a0a6" (UID: "d2a33401-2a04-4b67-ad4f-702034d7a0a6"). InnerVolumeSpecName "kube-api-access-v2lqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:54.625581 master-0 kubenswrapper[13984]: I0312 12:46:54.621670 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d2a33401-2a04-4b67-ad4f-702034d7a0a6" (UID: "d2a33401-2a04-4b67-ad4f-702034d7a0a6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:54.627739 master-0 kubenswrapper[13984]: I0312 12:46:54.627675 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-config" (OuterVolumeSpecName: "config") pod "d2a33401-2a04-4b67-ad4f-702034d7a0a6" (UID: "d2a33401-2a04-4b67-ad4f-702034d7a0a6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:54.680003 master-0 kubenswrapper[13984]: I0312 12:46:54.679704 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:54.680003 master-0 kubenswrapper[13984]: I0312 12:46:54.679775 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2lqs\" (UniqueName: \"kubernetes.io/projected/d2a33401-2a04-4b67-ad4f-702034d7a0a6-kube-api-access-v2lqs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:54.680003 master-0 kubenswrapper[13984]: I0312 12:46:54.679793 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a33401-2a04-4b67-ad4f-702034d7a0a6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:55.027137 master-0 kubenswrapper[13984]: I0312 12:46:55.027085 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-dfnvv" Mar 12 12:46:55.028633 master-0 kubenswrapper[13984]: I0312 12:46:55.028585 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-dfnvv" event={"ID":"d2a33401-2a04-4b67-ad4f-702034d7a0a6","Type":"ContainerDied","Data":"ed88a900d1f45b8fa141c42ffed42a89b0963beb81ed1c1fd3fcc1f326c15df0"} Mar 12 12:46:55.028633 master-0 kubenswrapper[13984]: I0312 12:46:55.028642 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed88a900d1f45b8fa141c42ffed42a89b0963beb81ed1c1fd3fcc1f326c15df0" Mar 12 12:46:55.398003 master-0 kubenswrapper[13984]: I0312 12:46:55.396676 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bd64f6f77-8vlbv"] Mar 12 12:46:55.398003 master-0 kubenswrapper[13984]: E0312 12:46:55.397192 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2a33401-2a04-4b67-ad4f-702034d7a0a6" containerName="neutron-db-sync" Mar 12 12:46:55.398003 master-0 kubenswrapper[13984]: I0312 12:46:55.397205 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2a33401-2a04-4b67-ad4f-702034d7a0a6" containerName="neutron-db-sync" Mar 12 12:46:55.398003 master-0 kubenswrapper[13984]: I0312 12:46:55.397472 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2a33401-2a04-4b67-ad4f-702034d7a0a6" containerName="neutron-db-sync" Mar 12 12:46:55.399569 master-0 kubenswrapper[13984]: I0312 12:46:55.398699 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.421048 master-0 kubenswrapper[13984]: I0312 12:46:55.420814 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd64f6f77-8vlbv"] Mar 12 12:46:55.511191 master-0 kubenswrapper[13984]: I0312 12:46:55.511119 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-sb\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.511810 master-0 kubenswrapper[13984]: I0312 12:46:55.511263 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-nb\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.511810 master-0 kubenswrapper[13984]: I0312 12:46:55.511293 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-config\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.516145 master-0 kubenswrapper[13984]: I0312 12:46:55.516054 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-svc\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.516586 master-0 kubenswrapper[13984]: I0312 12:46:55.516286 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-swift-storage-0\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.516586 master-0 kubenswrapper[13984]: I0312 12:46:55.516331 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhllj\" (UniqueName: \"kubernetes.io/projected/4ea6d788-25be-45ad-8bb5-b67fa78befb5-kube-api-access-lhllj\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.594613 master-0 kubenswrapper[13984]: I0312 12:46:55.592519 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7f777f5fbd-l96tl"] Mar 12 12:46:55.604639 master-0 kubenswrapper[13984]: I0312 12:46:55.603811 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.607809 master-0 kubenswrapper[13984]: I0312 12:46:55.607653 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 12 12:46:55.608495 master-0 kubenswrapper[13984]: I0312 12:46:55.607974 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 12 12:46:55.608495 master-0 kubenswrapper[13984]: I0312 12:46:55.608113 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.617894 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-httpd-config\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.617942 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92n8z\" (UniqueName: \"kubernetes.io/projected/91599efa-a74b-48bc-8553-58d2c33e1e75-kube-api-access-92n8z\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.617962 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-combined-ca-bundle\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.617994 13984 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-nb\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.618014 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-config\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.618059 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-svc\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.618118 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-swift-storage-0\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.618142 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhllj\" (UniqueName: \"kubernetes.io/projected/4ea6d788-25be-45ad-8bb5-b67fa78befb5-kube-api-access-lhllj\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.618184 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-ovndb-tls-certs\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.618225 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-sb\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.618242 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-config\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.619070 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-nb\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.619615 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-config\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.620378 13984 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-svc\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.620962 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-sb\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.622537 master-0 kubenswrapper[13984]: I0312 12:46:55.621548 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-swift-storage-0\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.637851 master-0 kubenswrapper[13984]: I0312 12:46:55.637725 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f777f5fbd-l96tl"] Mar 12 12:46:55.662689 master-0 kubenswrapper[13984]: I0312 12:46:55.662572 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhllj\" (UniqueName: \"kubernetes.io/projected/4ea6d788-25be-45ad-8bb5-b67fa78befb5-kube-api-access-lhllj\") pod \"dnsmasq-dns-bd64f6f77-8vlbv\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.722630 master-0 kubenswrapper[13984]: I0312 12:46:55.721611 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-ovndb-tls-certs\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " 
pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.722630 master-0 kubenswrapper[13984]: I0312 12:46:55.721705 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-config\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.722630 master-0 kubenswrapper[13984]: I0312 12:46:55.721771 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-httpd-config\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.722630 master-0 kubenswrapper[13984]: I0312 12:46:55.721800 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92n8z\" (UniqueName: \"kubernetes.io/projected/91599efa-a74b-48bc-8553-58d2c33e1e75-kube-api-access-92n8z\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.722630 master-0 kubenswrapper[13984]: I0312 12:46:55.721822 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-combined-ca-bundle\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.732150 master-0 kubenswrapper[13984]: I0312 12:46:55.726715 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-combined-ca-bundle\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " 
pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.732150 master-0 kubenswrapper[13984]: I0312 12:46:55.727583 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-config\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.732150 master-0 kubenswrapper[13984]: I0312 12:46:55.728296 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-ovndb-tls-certs\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.732150 master-0 kubenswrapper[13984]: I0312 12:46:55.729104 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-httpd-config\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.750518 master-0 kubenswrapper[13984]: I0312 12:46:55.750413 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92n8z\" (UniqueName: \"kubernetes.io/projected/91599efa-a74b-48bc-8553-58d2c33e1e75-kube-api-access-92n8z\") pod \"neutron-7f777f5fbd-l96tl\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:55.780124 master-0 kubenswrapper[13984]: I0312 12:46:55.780047 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:55.952111 master-0 kubenswrapper[13984]: I0312 12:46:55.951518 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:56.394494 master-0 kubenswrapper[13984]: I0312 12:46:56.394276 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bd64f6f77-8vlbv"] Mar 12 12:46:56.666432 master-0 kubenswrapper[13984]: W0312 12:46:56.666382 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod91599efa_a74b_48bc_8553_58d2c33e1e75.slice/crio-1cc2927014a05f3bc3e8b79591ec5ac31fe96ee1adf0d1cea15fad6b29e8653b WatchSource:0}: Error finding container 1cc2927014a05f3bc3e8b79591ec5ac31fe96ee1adf0d1cea15fad6b29e8653b: Status 404 returned error can't find the container with id 1cc2927014a05f3bc3e8b79591ec5ac31fe96ee1adf0d1cea15fad6b29e8653b Mar 12 12:46:56.672608 master-0 kubenswrapper[13984]: I0312 12:46:56.672425 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7f777f5fbd-l96tl"] Mar 12 12:46:57.083005 master-0 kubenswrapper[13984]: I0312 12:46:57.082844 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f777f5fbd-l96tl" event={"ID":"91599efa-a74b-48bc-8553-58d2c33e1e75","Type":"ContainerStarted","Data":"1cc2927014a05f3bc3e8b79591ec5ac31fe96ee1adf0d1cea15fad6b29e8653b"} Mar 12 12:46:57.106823 master-0 kubenswrapper[13984]: I0312 12:46:57.105663 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" event={"ID":"4ea6d788-25be-45ad-8bb5-b67fa78befb5","Type":"ContainerDied","Data":"2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a"} Mar 12 12:46:57.106823 master-0 kubenswrapper[13984]: I0312 12:46:57.104236 13984 generic.go:334] "Generic (PLEG): container finished" podID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerID="2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a" exitCode=0 Mar 12 12:46:57.106823 master-0 kubenswrapper[13984]: I0312 12:46:57.105805 13984 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" event={"ID":"4ea6d788-25be-45ad-8bb5-b67fa78befb5","Type":"ContainerStarted","Data":"bb0f2425bfa46cee451876593032e5fcda566bf4da64a08cf8b39bd1beeb8a91"} Mar 12 12:46:57.121029 master-0 kubenswrapper[13984]: I0312 12:46:57.120964 13984 generic.go:334] "Generic (PLEG): container finished" podID="c180ef3d-38aa-4843-b682-92e609e6d856" containerID="013d81cfdee7bee90efff72ae769540222de088bf2428818912f4829a5a21aa9" exitCode=0 Mar 12 12:46:57.121029 master-0 kubenswrapper[13984]: I0312 12:46:57.121014 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-db-sync-rcknr" event={"ID":"c180ef3d-38aa-4843-b682-92e609e6d856","Type":"ContainerDied","Data":"013d81cfdee7bee90efff72ae769540222de088bf2428818912f4829a5a21aa9"} Mar 12 12:46:58.139836 master-0 kubenswrapper[13984]: I0312 12:46:58.139781 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f777f5fbd-l96tl" event={"ID":"91599efa-a74b-48bc-8553-58d2c33e1e75","Type":"ContainerStarted","Data":"2920d12c181fbe99698f1a8120e7ea3dabc0637c05f3659d64093bb6d59a8101"} Mar 12 12:46:58.139836 master-0 kubenswrapper[13984]: I0312 12:46:58.139841 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f777f5fbd-l96tl" event={"ID":"91599efa-a74b-48bc-8553-58d2c33e1e75","Type":"ContainerStarted","Data":"53ea1660947e8524c6066f7f907dc7ab036b6030e398ed32f2b7a1b487336f0f"} Mar 12 12:46:58.140468 master-0 kubenswrapper[13984]: I0312 12:46:58.139924 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:46:58.144165 master-0 kubenswrapper[13984]: I0312 12:46:58.143838 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" event={"ID":"4ea6d788-25be-45ad-8bb5-b67fa78befb5","Type":"ContainerStarted","Data":"eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6"} Mar 12 
12:46:58.144165 master-0 kubenswrapper[13984]: I0312 12:46:58.143893 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:46:58.182165 master-0 kubenswrapper[13984]: I0312 12:46:58.182074 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7f777f5fbd-l96tl" podStartSLOduration=3.1820495 podStartE2EDuration="3.1820495s" podCreationTimestamp="2026-03-12 12:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:58.166784782 +0000 UTC m=+1350.364800274" watchObservedRunningTime="2026-03-12 12:46:58.1820495 +0000 UTC m=+1350.380064992" Mar 12 12:46:58.213161 master-0 kubenswrapper[13984]: I0312 12:46:58.213058 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" podStartSLOduration=3.213036465 podStartE2EDuration="3.213036465s" podCreationTimestamp="2026-03-12 12:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:46:58.201751461 +0000 UTC m=+1350.399766953" watchObservedRunningTime="2026-03-12 12:46:58.213036465 +0000 UTC m=+1350.411051967" Mar 12 12:46:58.451640 master-0 kubenswrapper[13984]: I0312 12:46:58.451589 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6dc97885fc-g6g5j"] Mar 12 12:46:58.453338 master-0 kubenswrapper[13984]: I0312 12:46:58.453308 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.455212 master-0 kubenswrapper[13984]: I0312 12:46:58.455150 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 12 12:46:58.455327 master-0 kubenswrapper[13984]: I0312 12:46:58.455313 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 12 12:46:58.465591 master-0 kubenswrapper[13984]: I0312 12:46:58.465530 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dc97885fc-g6g5j"] Mar 12 12:46:58.605998 master-0 kubenswrapper[13984]: I0312 12:46:58.605939 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-combined-ca-bundle\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.606837 master-0 kubenswrapper[13984]: I0312 12:46:58.606783 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6fc\" (UniqueName: \"kubernetes.io/projected/f399c312-60ae-457b-b49e-481c112547d1-kube-api-access-kx6fc\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.607116 master-0 kubenswrapper[13984]: I0312 12:46:58.607068 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-httpd-config\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.607215 master-0 kubenswrapper[13984]: I0312 12:46:58.607126 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-config\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.607215 master-0 kubenswrapper[13984]: I0312 12:46:58.607151 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-internal-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.607215 master-0 kubenswrapper[13984]: I0312 12:46:58.607202 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-ovndb-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.607428 master-0 kubenswrapper[13984]: I0312 12:46:58.607311 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-public-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.705844 master-0 kubenswrapper[13984]: I0312 12:46:58.705730 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:58.711674 master-0 kubenswrapper[13984]: I0312 12:46:58.711590 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-db-sync-config-data\") pod \"c180ef3d-38aa-4843-b682-92e609e6d856\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " Mar 12 12:46:58.711674 master-0 kubenswrapper[13984]: I0312 12:46:58.711638 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-scripts\") pod \"c180ef3d-38aa-4843-b682-92e609e6d856\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " Mar 12 12:46:58.712017 master-0 kubenswrapper[13984]: I0312 12:46:58.711689 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-config-data\") pod \"c180ef3d-38aa-4843-b682-92e609e6d856\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " Mar 12 12:46:58.712017 master-0 kubenswrapper[13984]: I0312 12:46:58.711751 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c180ef3d-38aa-4843-b682-92e609e6d856-etc-machine-id\") pod \"c180ef3d-38aa-4843-b682-92e609e6d856\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " Mar 12 12:46:58.712017 master-0 kubenswrapper[13984]: I0312 12:46:58.711841 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-combined-ca-bundle\") pod \"c180ef3d-38aa-4843-b682-92e609e6d856\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " Mar 12 12:46:58.712017 master-0 kubenswrapper[13984]: I0312 12:46:58.711923 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rkw\" (UniqueName: \"kubernetes.io/projected/c180ef3d-38aa-4843-b682-92e609e6d856-kube-api-access-n4rkw\") pod \"c180ef3d-38aa-4843-b682-92e609e6d856\" (UID: \"c180ef3d-38aa-4843-b682-92e609e6d856\") " Mar 12 12:46:58.712274 master-0 kubenswrapper[13984]: I0312 12:46:58.712175 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-internal-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.712274 master-0 kubenswrapper[13984]: I0312 12:46:58.712202 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-config\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.712274 master-0 kubenswrapper[13984]: I0312 12:46:58.712229 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-ovndb-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.712274 master-0 kubenswrapper[13984]: I0312 12:46:58.712271 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-public-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.712497 master-0 kubenswrapper[13984]: I0312 12:46:58.712372 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-combined-ca-bundle\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.712497 master-0 kubenswrapper[13984]: I0312 12:46:58.712419 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kx6fc\" (UniqueName: \"kubernetes.io/projected/f399c312-60ae-457b-b49e-481c112547d1-kube-api-access-kx6fc\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.712497 master-0 kubenswrapper[13984]: I0312 12:46:58.712452 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-httpd-config\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.712759 master-0 kubenswrapper[13984]: I0312 12:46:58.712673 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c180ef3d-38aa-4843-b682-92e609e6d856-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c180ef3d-38aa-4843-b682-92e609e6d856" (UID: "c180ef3d-38aa-4843-b682-92e609e6d856"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:46:58.717145 master-0 kubenswrapper[13984]: I0312 12:46:58.717101 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c180ef3d-38aa-4843-b682-92e609e6d856" (UID: "c180ef3d-38aa-4843-b682-92e609e6d856"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:58.717145 master-0 kubenswrapper[13984]: I0312 12:46:58.717168 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c180ef3d-38aa-4843-b682-92e609e6d856-kube-api-access-n4rkw" (OuterVolumeSpecName: "kube-api-access-n4rkw") pod "c180ef3d-38aa-4843-b682-92e609e6d856" (UID: "c180ef3d-38aa-4843-b682-92e609e6d856"). InnerVolumeSpecName "kube-api-access-n4rkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:46:58.720203 master-0 kubenswrapper[13984]: I0312 12:46:58.720124 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-httpd-config\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.720757 master-0 kubenswrapper[13984]: I0312 12:46:58.720706 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-scripts" (OuterVolumeSpecName: "scripts") pod "c180ef3d-38aa-4843-b682-92e609e6d856" (UID: "c180ef3d-38aa-4843-b682-92e609e6d856"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:58.724144 master-0 kubenswrapper[13984]: I0312 12:46:58.723667 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-ovndb-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.732853 master-0 kubenswrapper[13984]: I0312 12:46:58.732784 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-internal-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.742470 master-0 kubenswrapper[13984]: I0312 12:46:58.742413 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-public-tls-certs\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.755382 master-0 kubenswrapper[13984]: I0312 12:46:58.755320 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-config\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.762358 master-0 kubenswrapper[13984]: I0312 12:46:58.762307 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f399c312-60ae-457b-b49e-481c112547d1-combined-ca-bundle\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 
12:46:58.764513 master-0 kubenswrapper[13984]: I0312 12:46:58.764428 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx6fc\" (UniqueName: \"kubernetes.io/projected/f399c312-60ae-457b-b49e-481c112547d1-kube-api-access-kx6fc\") pod \"neutron-6dc97885fc-g6g5j\" (UID: \"f399c312-60ae-457b-b49e-481c112547d1\") " pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.786960 master-0 kubenswrapper[13984]: I0312 12:46:58.786888 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c180ef3d-38aa-4843-b682-92e609e6d856" (UID: "c180ef3d-38aa-4843-b682-92e609e6d856"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:58.808974 master-0 kubenswrapper[13984]: I0312 12:46:58.808906 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:46:58.815617 master-0 kubenswrapper[13984]: I0312 12:46:58.815525 13984 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:58.815617 master-0 kubenswrapper[13984]: I0312 12:46:58.815568 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:58.815617 master-0 kubenswrapper[13984]: I0312 12:46:58.815578 13984 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c180ef3d-38aa-4843-b682-92e609e6d856-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:58.815617 master-0 kubenswrapper[13984]: I0312 12:46:58.815586 13984 
reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:58.815617 master-0 kubenswrapper[13984]: I0312 12:46:58.815596 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rkw\" (UniqueName: \"kubernetes.io/projected/c180ef3d-38aa-4843-b682-92e609e6d856-kube-api-access-n4rkw\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:58.828750 master-0 kubenswrapper[13984]: I0312 12:46:58.828662 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-config-data" (OuterVolumeSpecName: "config-data") pod "c180ef3d-38aa-4843-b682-92e609e6d856" (UID: "c180ef3d-38aa-4843-b682-92e609e6d856"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:46:58.941504 master-0 kubenswrapper[13984]: I0312 12:46:58.931896 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c180ef3d-38aa-4843-b682-92e609e6d856-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:46:59.165539 master-0 kubenswrapper[13984]: I0312 12:46:59.165437 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-db-sync-rcknr" event={"ID":"c180ef3d-38aa-4843-b682-92e609e6d856","Type":"ContainerDied","Data":"960a460dc98ea51aaa42d8a75b7846865b7660301833ea5b615d33f2040563bb"} Mar 12 12:46:59.165539 master-0 kubenswrapper[13984]: I0312 12:46:59.165487 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="960a460dc98ea51aaa42d8a75b7846865b7660301833ea5b615d33f2040563bb" Mar 12 12:46:59.165539 master-0 kubenswrapper[13984]: I0312 12:46:59.165500 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-db-sync-rcknr" Mar 12 12:46:59.377177 master-0 kubenswrapper[13984]: W0312 12:46:59.377116 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf399c312_60ae_457b_b49e_481c112547d1.slice/crio-96c0162587e954a012d1dc3cf6c59f1692c4841f4206fdcb3b399a6067682e79 WatchSource:0}: Error finding container 96c0162587e954a012d1dc3cf6c59f1692c4841f4206fdcb3b399a6067682e79: Status 404 returned error can't find the container with id 96c0162587e954a012d1dc3cf6c59f1692c4841f4206fdcb3b399a6067682e79 Mar 12 12:46:59.380554 master-0 kubenswrapper[13984]: I0312 12:46:59.379338 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6dc97885fc-g6g5j"] Mar 12 12:46:59.610511 master-0 kubenswrapper[13984]: I0312 12:46:59.605726 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"] Mar 12 12:46:59.610511 master-0 kubenswrapper[13984]: E0312 12:46:59.606231 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c180ef3d-38aa-4843-b682-92e609e6d856" containerName="cinder-8c9c7-db-sync" Mar 12 12:46:59.610511 master-0 kubenswrapper[13984]: I0312 12:46:59.606246 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="c180ef3d-38aa-4843-b682-92e609e6d856" containerName="cinder-8c9c7-db-sync" Mar 12 12:46:59.610511 master-0 kubenswrapper[13984]: I0312 12:46:59.606471 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="c180ef3d-38aa-4843-b682-92e609e6d856" containerName="cinder-8c9c7-db-sync" Mar 12 12:46:59.610511 master-0 kubenswrapper[13984]: I0312 12:46:59.607518 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.616523 master-0 kubenswrapper[13984]: I0312 12:46:59.615499 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-scheduler-config-data" Mar 12 12:46:59.616523 master-0 kubenswrapper[13984]: I0312 12:46:59.615597 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-config-data" Mar 12 12:46:59.616523 master-0 kubenswrapper[13984]: I0312 12:46:59.615758 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-scripts" Mar 12 12:46:59.646198 master-0 kubenswrapper[13984]: I0312 12:46:59.646111 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"] Mar 12 12:46:59.698675 master-0 kubenswrapper[13984]: I0312 12:46:59.693436 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c0e883-944a-444f-ad81-f88bb8dec00e-etc-machine-id\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.698675 master-0 kubenswrapper[13984]: I0312 12:46:59.693512 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data-custom\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.698675 master-0 kubenswrapper[13984]: I0312 12:46:59.693585 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-combined-ca-bundle\") pod \"cinder-8c9c7-scheduler-0\" (UID: 
\"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.698675 master-0 kubenswrapper[13984]: I0312 12:46:59.693604 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntmgk\" (UniqueName: \"kubernetes.io/projected/13c0e883-944a-444f-ad81-f88bb8dec00e-kube-api-access-ntmgk\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.698675 master-0 kubenswrapper[13984]: I0312 12:46:59.693665 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.698675 master-0 kubenswrapper[13984]: I0312 12:46:59.693705 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-scripts\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.698675 master-0 kubenswrapper[13984]: I0312 12:46:59.693881 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"] Mar 12 12:46:59.731616 master-0 kubenswrapper[13984]: I0312 12:46:59.726150 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.750752 master-0 kubenswrapper[13984]: I0312 12:46:59.749906 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-volume-lvm-iscsi-config-data" Mar 12 12:46:59.799036 master-0 kubenswrapper[13984]: I0312 12:46:59.797138 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c0e883-944a-444f-ad81-f88bb8dec00e-etc-machine-id\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.799036 master-0 kubenswrapper[13984]: I0312 12:46:59.797224 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data-custom\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.799036 master-0 kubenswrapper[13984]: I0312 12:46:59.797300 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-combined-ca-bundle\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.799036 master-0 kubenswrapper[13984]: I0312 12:46:59.797340 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntmgk\" (UniqueName: \"kubernetes.io/projected/13c0e883-944a-444f-ad81-f88bb8dec00e-kube-api-access-ntmgk\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.799036 master-0 kubenswrapper[13984]: I0312 12:46:59.797425 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.799391 master-0 kubenswrapper[13984]: I0312 12:46:59.799140 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-scripts\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.801784 master-0 kubenswrapper[13984]: I0312 12:46:59.799955 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c0e883-944a-444f-ad81-f88bb8dec00e-etc-machine-id\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.809523 master-0 kubenswrapper[13984]: I0312 12:46:59.807609 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data-custom\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.818973 master-0 kubenswrapper[13984]: I0312 12:46:59.812862 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-combined-ca-bundle\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.841755 master-0 kubenswrapper[13984]: I0312 12:46:59.841272 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ntmgk\" (UniqueName: \"kubernetes.io/projected/13c0e883-944a-444f-ad81-f88bb8dec00e-kube-api-access-ntmgk\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.858500 master-0 kubenswrapper[13984]: I0312 12:46:59.856805 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-scripts\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.878507 master-0 kubenswrapper[13984]: I0312 12:46:59.877711 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"] Mar 12 12:46:59.884157 master-0 kubenswrapper[13984]: I0312 12:46:59.882573 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.904994 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-combined-ca-bundle\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905298 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-brick\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " 
pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905516 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84fm8\" (UniqueName: \"kubernetes.io/projected/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-kube-api-access-84fm8\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905559 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-dev\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905630 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905648 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-run\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905678 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-lib-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905744 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-machine-id\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905763 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-nvme\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905816 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data-custom\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905838 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-sys\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905915 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-lib-modules\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905976 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-iscsi\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.905996 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-scripts\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.906298 master-0 kubenswrapper[13984]: I0312 12:46:59.906050 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:46:59.931004 master-0 kubenswrapper[13984]: I0312 12:46:59.930569 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd64f6f77-8vlbv"] Mar 12 12:46:59.941619 master-0 kubenswrapper[13984]: I0312 12:46:59.937986 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:46:59.945814 master-0 kubenswrapper[13984]: I0312 12:46:59.942577 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-backup-0"] Mar 12 12:46:59.945814 master-0 kubenswrapper[13984]: I0312 12:46:59.944692 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:46:59.959637 master-0 kubenswrapper[13984]: I0312 12:46:59.956890 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-backup-config-data" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009166 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-machine-id\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009243 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-nvme\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009332 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data-custom\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009340 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-machine-id\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009493 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-sys\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009544 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-lib-modules\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009556 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-nvme\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009594 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-sys\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009721 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-scripts\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009755 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-lib-modules\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009776 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-iscsi\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009808 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009874 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-combined-ca-bundle\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009898 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-brick\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009944 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84fm8\" (UniqueName: \"kubernetes.io/projected/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-kube-api-access-84fm8\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009969 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-dev\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.009998 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.010011 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-run\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.010500 master-0 kubenswrapper[13984]: I0312 12:47:00.010047 13984 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-lib-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.011385 master-0 kubenswrapper[13984]: I0312 12:47:00.010573 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-iscsi\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.017505 master-0 kubenswrapper[13984]: I0312 12:47:00.012096 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-dev\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.017505 master-0 kubenswrapper[13984]: I0312 12:47:00.012347 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-run\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.017505 master-0 kubenswrapper[13984]: I0312 12:47:00.013338 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-brick\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.017505 master-0 kubenswrapper[13984]: I0312 12:47:00.013648 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: 
\"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.017505 master-0 kubenswrapper[13984]: I0312 12:47:00.014213 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-lib-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.033523 master-0 kubenswrapper[13984]: I0312 12:47:00.023447 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-649858dd65-skgqw"] Mar 12 12:47:00.033523 master-0 kubenswrapper[13984]: I0312 12:47:00.024232 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-scripts\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.033523 master-0 kubenswrapper[13984]: I0312 12:47:00.026687 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.044780 master-0 kubenswrapper[13984]: I0312 12:47:00.044359 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-combined-ca-bundle\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 
12:47:00.044780 master-0 kubenswrapper[13984]: I0312 12:47:00.044698 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data-custom\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.062660 master-0 kubenswrapper[13984]: I0312 12:47:00.045321 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-backup-0"] Mar 12 12:47:00.062660 master-0 kubenswrapper[13984]: I0312 12:47:00.045532 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.062660 master-0 kubenswrapper[13984]: I0312 12:47:00.046793 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84fm8\" (UniqueName: \"kubernetes.io/projected/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-kube-api-access-84fm8\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.072501 master-0 kubenswrapper[13984]: I0312 12:47:00.070839 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649858dd65-skgqw"] Mar 12 12:47:00.082512 master-0 kubenswrapper[13984]: I0312 12:47:00.079267 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:00.090337 master-0 kubenswrapper[13984]: I0312 12:47:00.082993 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-api-0"] Mar 12 12:47:00.090337 master-0 kubenswrapper[13984]: I0312 12:47:00.085393 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.090337 master-0 kubenswrapper[13984]: I0312 12:47:00.089333 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-api-config-data" Mar 12 12:47:00.099142 master-0 kubenswrapper[13984]: I0312 12:47:00.097574 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-api-0"] Mar 12 12:47:00.112083 master-0 kubenswrapper[13984]: I0312 12:47:00.112014 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data-custom\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112180 master-0 kubenswrapper[13984]: I0312 12:47:00.112123 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-nvme\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112180 master-0 kubenswrapper[13984]: I0312 12:47:00.112152 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112321 master-0 kubenswrapper[13984]: I0312 12:47:00.112178 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6ghn\" (UniqueName: \"kubernetes.io/projected/d3192133-638b-48ca-83d2-a98db6495759-kube-api-access-t6ghn\") pod \"cinder-8c9c7-api-0\" (UID: 
\"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.112321 master-0 kubenswrapper[13984]: I0312 12:47:00.112242 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-scripts\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.112321 master-0 kubenswrapper[13984]: I0312 12:47:00.112301 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-lib-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112321 master-0 kubenswrapper[13984]: I0312 12:47:00.112319 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-combined-ca-bundle\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.112445 master-0 kubenswrapper[13984]: I0312 12:47:00.112340 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srvfm\" (UniqueName: \"kubernetes.io/projected/d1ce46a3-0909-4d15-b166-fa480a7e6164-kube-api-access-srvfm\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112445 master-0 kubenswrapper[13984]: I0312 12:47:00.112357 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-dev\") pod \"cinder-8c9c7-backup-0\" 
(UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112445 master-0 kubenswrapper[13984]: I0312 12:47:00.112372 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.112445 master-0 kubenswrapper[13984]: I0312 12:47:00.112390 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-swift-storage-0\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.112445 master-0 kubenswrapper[13984]: I0312 12:47:00.112408 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-lib-modules\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112445 master-0 kubenswrapper[13984]: I0312 12:47:00.112423 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-combined-ca-bundle\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112640 master-0 kubenswrapper[13984]: I0312 12:47:00.112448 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-sb\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.112640 master-0 kubenswrapper[13984]: I0312 12:47:00.112497 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-iscsi\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112640 master-0 kubenswrapper[13984]: I0312 12:47:00.112518 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-config\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.112640 master-0 kubenswrapper[13984]: I0312 12:47:00.112536 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-sys\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112640 master-0 kubenswrapper[13984]: I0312 12:47:00.112559 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data-custom\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.112778 master-0 kubenswrapper[13984]: I0312 12:47:00.112673 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-scripts\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.112861 master-0 kubenswrapper[13984]: I0312 12:47:00.112810 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3192133-638b-48ca-83d2-a98db6495759-logs\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.112928 master-0 kubenswrapper[13984]: I0312 12:47:00.112861 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-run\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.113133 master-0 kubenswrapper[13984]: I0312 12:47:00.113051 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpsqd\" (UniqueName: \"kubernetes.io/projected/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-kube-api-access-qpsqd\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.113133 master-0 kubenswrapper[13984]: I0312 12:47:00.113110 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.113215 master-0 kubenswrapper[13984]: I0312 12:47:00.113188 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-svc\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.113501 master-0 kubenswrapper[13984]: I0312 12:47:00.113227 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-brick\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.113501 master-0 kubenswrapper[13984]: I0312 12:47:00.113316 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3192133-638b-48ca-83d2-a98db6495759-etc-machine-id\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.113501 master-0 kubenswrapper[13984]: I0312 12:47:00.113376 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-nb\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.113501 master-0 kubenswrapper[13984]: I0312 12:47:00.113409 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-machine-id\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.206715 master-0 kubenswrapper[13984]: I0312 12:47:00.206654 13984 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" podUID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerName="dnsmasq-dns" containerID="cri-o://eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6" gracePeriod=10 Mar 12 12:47:00.207191 master-0 kubenswrapper[13984]: I0312 12:47:00.206990 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc97885fc-g6g5j" event={"ID":"f399c312-60ae-457b-b49e-481c112547d1","Type":"ContainerStarted","Data":"3b9ef4e60b671a8e34c931675c33392811581e2ea72a2f77446d2e426f43bfe5"} Mar 12 12:47:00.207191 master-0 kubenswrapper[13984]: I0312 12:47:00.207025 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc97885fc-g6g5j" event={"ID":"f399c312-60ae-457b-b49e-481c112547d1","Type":"ContainerStarted","Data":"96c0162587e954a012d1dc3cf6c59f1692c4841f4206fdcb3b399a6067682e79"} Mar 12 12:47:00.218027 master-0 kubenswrapper[13984]: I0312 12:47:00.217969 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-iscsi\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218120 master-0 kubenswrapper[13984]: I0312 12:47:00.218034 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-config\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.218120 master-0 kubenswrapper[13984]: I0312 12:47:00.218065 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-sys\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " 
pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218120 master-0 kubenswrapper[13984]: I0312 12:47:00.218085 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-scripts\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218120 master-0 kubenswrapper[13984]: I0312 12:47:00.218103 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data-custom\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.218278 master-0 kubenswrapper[13984]: I0312 12:47:00.218125 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3192133-638b-48ca-83d2-a98db6495759-logs\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.218278 master-0 kubenswrapper[13984]: I0312 12:47:00.218139 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-run\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218278 master-0 kubenswrapper[13984]: I0312 12:47:00.218198 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpsqd\" (UniqueName: \"kubernetes.io/projected/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-kube-api-access-qpsqd\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.218278 master-0 kubenswrapper[13984]: 
I0312 12:47:00.218227 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218278 master-0 kubenswrapper[13984]: I0312 12:47:00.218247 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-svc\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.218278 master-0 kubenswrapper[13984]: I0312 12:47:00.218265 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-brick\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218461 master-0 kubenswrapper[13984]: I0312 12:47:00.218282 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3192133-638b-48ca-83d2-a98db6495759-etc-machine-id\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.218461 master-0 kubenswrapper[13984]: I0312 12:47:00.218322 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-nb\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.218461 master-0 kubenswrapper[13984]: I0312 12:47:00.218346 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-machine-id\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218461 master-0 kubenswrapper[13984]: I0312 12:47:00.218371 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data-custom\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218461 master-0 kubenswrapper[13984]: I0312 12:47:00.218411 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-nvme\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218461 master-0 kubenswrapper[13984]: I0312 12:47:00.218434 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218461 master-0 kubenswrapper[13984]: I0312 12:47:00.218459 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6ghn\" (UniqueName: \"kubernetes.io/projected/d3192133-638b-48ca-83d2-a98db6495759-kube-api-access-t6ghn\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218492 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-scripts\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218515 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-lib-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218532 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-combined-ca-bundle\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218554 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srvfm\" (UniqueName: \"kubernetes.io/projected/d1ce46a3-0909-4d15-b166-fa480a7e6164-kube-api-access-srvfm\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218571 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218587 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-dev\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218605 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-swift-storage-0\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218620 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-lib-modules\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218638 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-combined-ca-bundle\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.218691 master-0 kubenswrapper[13984]: I0312 12:47:00.218662 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-sb\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.219617 master-0 kubenswrapper[13984]: I0312 12:47:00.219130 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-machine-id\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.223706 master-0 kubenswrapper[13984]: I0312 12:47:00.223652 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-iscsi\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.225208 master-0 kubenswrapper[13984]: I0312 12:47:00.225131 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-config\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.225208 master-0 kubenswrapper[13984]: I0312 12:47:00.225167 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-sys\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.227866 master-0 kubenswrapper[13984]: I0312 12:47:00.227819 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-run\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.228197 master-0 kubenswrapper[13984]: I0312 12:47:00.228171 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3192133-638b-48ca-83d2-a98db6495759-logs\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " 
pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.228239 master-0 kubenswrapper[13984]: I0312 12:47:00.228226 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3192133-638b-48ca-83d2-a98db6495759-etc-machine-id\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.228271 master-0 kubenswrapper[13984]: I0312 12:47:00.228258 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-brick\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.228562 master-0 kubenswrapper[13984]: I0312 12:47:00.228523 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-svc\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.231172 master-0 kubenswrapper[13984]: I0312 12:47:00.230935 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-nvme\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.235612 master-0 kubenswrapper[13984]: I0312 12:47:00.234834 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-dev\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.235612 master-0 kubenswrapper[13984]: I0312 12:47:00.234925 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-lib-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.235612 master-0 kubenswrapper[13984]: I0312 12:47:00.234952 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.235612 master-0 kubenswrapper[13984]: I0312 12:47:00.234998 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-lib-modules\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.235612 master-0 kubenswrapper[13984]: I0312 12:47:00.235538 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-swift-storage-0\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.235819 master-0 kubenswrapper[13984]: I0312 12:47:00.235745 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-sb\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.235819 master-0 kubenswrapper[13984]: I0312 12:47:00.235803 13984 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-nb\") pod \"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.247431 master-0 kubenswrapper[13984]: I0312 12:47:00.247256 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data-custom\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.247431 master-0 kubenswrapper[13984]: I0312 12:47:00.247342 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data-custom\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.247431 master-0 kubenswrapper[13984]: I0312 12:47:00.247353 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-scripts\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.247431 master-0 kubenswrapper[13984]: I0312 12:47:00.247376 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-combined-ca-bundle\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.247820 master-0 kubenswrapper[13984]: I0312 12:47:00.247779 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.248198 master-0 kubenswrapper[13984]: I0312 12:47:00.248167 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.251057 master-0 kubenswrapper[13984]: I0312 12:47:00.251018 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-scripts\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.252733 master-0 kubenswrapper[13984]: I0312 12:47:00.252609 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-combined-ca-bundle\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.331666 master-0 kubenswrapper[13984]: I0312 12:47:00.318194 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6ghn\" (UniqueName: \"kubernetes.io/projected/d3192133-638b-48ca-83d2-a98db6495759-kube-api-access-t6ghn\") pod \"cinder-8c9c7-api-0\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.340505 master-0 kubenswrapper[13984]: I0312 12:47:00.332991 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpsqd\" (UniqueName: \"kubernetes.io/projected/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-kube-api-access-qpsqd\") pod 
\"dnsmasq-dns-649858dd65-skgqw\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.340505 master-0 kubenswrapper[13984]: I0312 12:47:00.335767 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srvfm\" (UniqueName: \"kubernetes.io/projected/d1ce46a3-0909-4d15-b166-fa480a7e6164-kube-api-access-srvfm\") pod \"cinder-8c9c7-backup-0\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.351508 master-0 kubenswrapper[13984]: I0312 12:47:00.349995 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:00.375501 master-0 kubenswrapper[13984]: I0312 12:47:00.373447 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:00.424605 master-0 kubenswrapper[13984]: I0312 12:47:00.421930 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:00.609747 master-0 kubenswrapper[13984]: I0312 12:47:00.609315 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"] Mar 12 12:47:01.007207 master-0 kubenswrapper[13984]: I0312 12:47:01.007132 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"] Mar 12 12:47:01.206143 master-0 kubenswrapper[13984]: I0312 12:47:01.203060 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:47:01.224505 master-0 kubenswrapper[13984]: I0312 12:47:01.221512 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-649858dd65-skgqw"] Mar 12 12:47:01.240703 master-0 kubenswrapper[13984]: I0312 12:47:01.232537 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"634c7359-7f98-4b5c-b01b-ace6fd3fcf34","Type":"ContainerStarted","Data":"226ba7589e7220629d1966ba084c24569efd6699287db9487cbdefdcacf723ff"} Mar 12 12:47:01.255209 master-0 kubenswrapper[13984]: I0312 12:47:01.255130 13984 generic.go:334] "Generic (PLEG): container finished" podID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerID="eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6" exitCode=0 Mar 12 12:47:01.255428 master-0 kubenswrapper[13984]: I0312 12:47:01.255246 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" event={"ID":"4ea6d788-25be-45ad-8bb5-b67fa78befb5","Type":"ContainerDied","Data":"eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6"} Mar 12 12:47:01.255428 master-0 kubenswrapper[13984]: I0312 12:47:01.255284 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" event={"ID":"4ea6d788-25be-45ad-8bb5-b67fa78befb5","Type":"ContainerDied","Data":"bb0f2425bfa46cee451876593032e5fcda566bf4da64a08cf8b39bd1beeb8a91"} Mar 12 12:47:01.255428 master-0 kubenswrapper[13984]: I0312 12:47:01.255292 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bd64f6f77-8vlbv" Mar 12 12:47:01.255428 master-0 kubenswrapper[13984]: I0312 12:47:01.255305 13984 scope.go:117] "RemoveContainer" containerID="eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6" Mar 12 12:47:01.295268 master-0 kubenswrapper[13984]: I0312 12:47:01.295196 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6dc97885fc-g6g5j" event={"ID":"f399c312-60ae-457b-b49e-481c112547d1","Type":"ContainerStarted","Data":"b64d88d5dc50eb135e5d90f015f3c2e72dbe41ae897157672a15e96fff9dada9"} Mar 12 12:47:01.295460 master-0 kubenswrapper[13984]: I0312 12:47:01.295301 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:47:01.342686 master-0 kubenswrapper[13984]: I0312 12:47:01.342553 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"13c0e883-944a-444f-ad81-f88bb8dec00e","Type":"ContainerStarted","Data":"b3b9296fce29362ac2c5a49ef3005c1b05cd9732d62ce37aed068a5955a4140a"} Mar 12 12:47:01.365999 master-0 kubenswrapper[13984]: I0312 12:47:01.359819 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-svc\") pod \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " Mar 12 12:47:01.365999 master-0 kubenswrapper[13984]: I0312 12:47:01.359962 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-config\") pod \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " Mar 12 12:47:01.365999 master-0 kubenswrapper[13984]: I0312 12:47:01.360007 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-nb\") pod \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " Mar 12 12:47:01.365999 master-0 kubenswrapper[13984]: I0312 12:47:01.360045 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhllj\" (UniqueName: \"kubernetes.io/projected/4ea6d788-25be-45ad-8bb5-b67fa78befb5-kube-api-access-lhllj\") pod \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " Mar 12 12:47:01.365999 master-0 kubenswrapper[13984]: I0312 12:47:01.360081 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-sb\") pod \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " Mar 12 12:47:01.365999 master-0 kubenswrapper[13984]: I0312 12:47:01.360227 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-swift-storage-0\") pod \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\" (UID: \"4ea6d788-25be-45ad-8bb5-b67fa78befb5\") " Mar 12 12:47:01.365999 master-0 kubenswrapper[13984]: I0312 12:47:01.363561 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea6d788-25be-45ad-8bb5-b67fa78befb5-kube-api-access-lhllj" (OuterVolumeSpecName: "kube-api-access-lhllj") pod "4ea6d788-25be-45ad-8bb5-b67fa78befb5" (UID: "4ea6d788-25be-45ad-8bb5-b67fa78befb5"). InnerVolumeSpecName "kube-api-access-lhllj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:01.366609 master-0 kubenswrapper[13984]: I0312 12:47:01.366502 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-backup-0"] Mar 12 12:47:01.373392 master-0 kubenswrapper[13984]: I0312 12:47:01.372639 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6dc97885fc-g6g5j" podStartSLOduration=3.372622018 podStartE2EDuration="3.372622018s" podCreationTimestamp="2026-03-12 12:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:01.33894738 +0000 UTC m=+1353.536962872" watchObservedRunningTime="2026-03-12 12:47:01.372622018 +0000 UTC m=+1353.570637500" Mar 12 12:47:01.424914 master-0 kubenswrapper[13984]: I0312 12:47:01.424824 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-api-0"] Mar 12 12:47:01.475377 master-0 kubenswrapper[13984]: I0312 12:47:01.475324 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhllj\" (UniqueName: \"kubernetes.io/projected/4ea6d788-25be-45ad-8bb5-b67fa78befb5-kube-api-access-lhllj\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:01.538723 master-0 kubenswrapper[13984]: I0312 12:47:01.538211 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-config" (OuterVolumeSpecName: "config") pod "4ea6d788-25be-45ad-8bb5-b67fa78befb5" (UID: "4ea6d788-25be-45ad-8bb5-b67fa78befb5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:47:01.546434 master-0 kubenswrapper[13984]: I0312 12:47:01.546026 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ea6d788-25be-45ad-8bb5-b67fa78befb5" (UID: "4ea6d788-25be-45ad-8bb5-b67fa78befb5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:47:01.564432 master-0 kubenswrapper[13984]: I0312 12:47:01.564340 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ea6d788-25be-45ad-8bb5-b67fa78befb5" (UID: "4ea6d788-25be-45ad-8bb5-b67fa78befb5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:47:01.568901 master-0 kubenswrapper[13984]: I0312 12:47:01.568676 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ea6d788-25be-45ad-8bb5-b67fa78befb5" (UID: "4ea6d788-25be-45ad-8bb5-b67fa78befb5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:47:01.569422 master-0 kubenswrapper[13984]: I0312 12:47:01.569364 13984 scope.go:117] "RemoveContainer" containerID="2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a"
Mar 12 12:47:01.575934 master-0 kubenswrapper[13984]: I0312 12:47:01.575885 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4ea6d788-25be-45ad-8bb5-b67fa78befb5" (UID: "4ea6d788-25be-45ad-8bb5-b67fa78befb5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:47:01.579454 master-0 kubenswrapper[13984]: I0312 12:47:01.577459 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:01.579454 master-0 kubenswrapper[13984]: I0312 12:47:01.578866 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:01.579454 master-0 kubenswrapper[13984]: I0312 12:47:01.578887 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:01.579454 master-0 kubenswrapper[13984]: I0312 12:47:01.578899 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:01.579454 master-0 kubenswrapper[13984]: I0312 12:47:01.578913 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ea6d788-25be-45ad-8bb5-b67fa78befb5-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:01.633183 master-0 kubenswrapper[13984]: I0312 12:47:01.633129 13984 scope.go:117] "RemoveContainer" containerID="eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6"
Mar 12 12:47:01.633952 master-0 kubenswrapper[13984]: E0312 12:47:01.633728 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6\": container with ID starting with eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6 not found: ID does not exist" containerID="eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6"
Mar 12 12:47:01.633952 master-0 kubenswrapper[13984]: I0312 12:47:01.633786 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6"} err="failed to get container status \"eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6\": rpc error: code = NotFound desc = could not find container \"eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6\": container with ID starting with eb59c104faaae785210083ac180c7b245540e6ca79f293baafabbd24844b46a6 not found: ID does not exist"
Mar 12 12:47:01.633952 master-0 kubenswrapper[13984]: I0312 12:47:01.633820 13984 scope.go:117] "RemoveContainer" containerID="2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a"
Mar 12 12:47:01.634388 master-0 kubenswrapper[13984]: E0312 12:47:01.634329 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a\": container with ID starting with 2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a not found: ID does not exist" containerID="2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a"
Mar 12 12:47:01.634388 master-0 kubenswrapper[13984]: I0312 12:47:01.634364 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a"} err="failed to get container status \"2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a\": rpc error: code = NotFound desc = could not find container \"2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a\": container with ID starting with 2aa38c1e223c2d72a52f774cbcb5631f112dda486c1f206c3f2879f2f211ab3a not found: ID does not exist"
Mar 12 12:47:01.903342 master-0 kubenswrapper[13984]: I0312 12:47:01.900669 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bd64f6f77-8vlbv"]
Mar 12 12:47:01.920640 master-0 kubenswrapper[13984]: I0312 12:47:01.920601 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bd64f6f77-8vlbv"]
Mar 12 12:47:02.003253 master-0 kubenswrapper[13984]: I0312 12:47:02.003132 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" path="/var/lib/kubelet/pods/4ea6d788-25be-45ad-8bb5-b67fa78befb5/volumes"
Mar 12 12:47:02.378358 master-0 kubenswrapper[13984]: I0312 12:47:02.377077 13984 generic.go:334] "Generic (PLEG): container finished" podID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerID="2c117435077754597d1f54b4e4047dd1d45a80154a61615dc9286f99ef0294ec" exitCode=0
Mar 12 12:47:02.378358 master-0 kubenswrapper[13984]: I0312 12:47:02.377140 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649858dd65-skgqw" event={"ID":"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09","Type":"ContainerDied","Data":"2c117435077754597d1f54b4e4047dd1d45a80154a61615dc9286f99ef0294ec"}
Mar 12 12:47:02.378358 master-0 kubenswrapper[13984]: I0312 12:47:02.377166 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649858dd65-skgqw" event={"ID":"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09","Type":"ContainerStarted","Data":"9a778661502fd4144e742c88cd48ac41d9beaab32305c78ac59f4f378f4991bb"}
Mar 12 12:47:02.419799 master-0 kubenswrapper[13984]: I0312 12:47:02.387308 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"d3192133-638b-48ca-83d2-a98db6495759","Type":"ContainerStarted","Data":"0889e15aa37cd14d18ae98025e49ccca61a299a58f2fc48ce58222d6586d0ad3"}
Mar 12 12:47:02.419799 master-0 kubenswrapper[13984]: I0312 12:47:02.398404 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"d1ce46a3-0909-4d15-b166-fa480a7e6164","Type":"ContainerStarted","Data":"598407bbaa32270d50c3f82a3df29944ef2f3e45c11a3a169a35914270f9f2cf"}
Mar 12 12:47:02.464068 master-0 kubenswrapper[13984]: I0312 12:47:02.464013 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-api-0"]
Mar 12 12:47:03.439330 master-0 kubenswrapper[13984]: I0312 12:47:03.439271 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649858dd65-skgqw" event={"ID":"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09","Type":"ContainerStarted","Data":"0129114a5a66baabd536135c9b5018f429623b02f0db9f17dc905df1e379b1f6"}
Mar 12 12:47:03.440101 master-0 kubenswrapper[13984]: I0312 12:47:03.439417 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-649858dd65-skgqw"
Mar 12 12:47:03.455746 master-0 kubenswrapper[13984]: I0312 12:47:03.455688 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"634c7359-7f98-4b5c-b01b-ace6fd3fcf34","Type":"ContainerStarted","Data":"6fccd0c52541641099f33277c03b55357011f80f52c7f0c1ed849de024caef7a"}
Mar 12 12:47:03.455746 master-0 kubenswrapper[13984]: I0312 12:47:03.455745 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"634c7359-7f98-4b5c-b01b-ace6fd3fcf34","Type":"ContainerStarted","Data":"8e7c9395476f01caae167cd43716272e9e5128f931102612c91124b2bdccc17f"}
Mar 12 12:47:03.461666 master-0 kubenswrapper[13984]: I0312 12:47:03.461526 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"d3192133-638b-48ca-83d2-a98db6495759","Type":"ContainerStarted","Data":"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"}
Mar 12 12:47:03.474090 master-0 kubenswrapper[13984]: I0312 12:47:03.474024 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-649858dd65-skgqw" podStartSLOduration=4.47400251 podStartE2EDuration="4.47400251s" podCreationTimestamp="2026-03-12 12:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:03.464431365 +0000 UTC m=+1355.662446857" watchObservedRunningTime="2026-03-12 12:47:03.47400251 +0000 UTC m=+1355.672018002"
Mar 12 12:47:03.479761 master-0 kubenswrapper[13984]: I0312 12:47:03.479108 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"d1ce46a3-0909-4d15-b166-fa480a7e6164","Type":"ContainerStarted","Data":"eeb54250e63c3840e474dc512537984d482244e8df6768125caa1d48b8cd85a3"}
Mar 12 12:47:03.496194 master-0 kubenswrapper[13984]: I0312 12:47:03.496139 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"13c0e883-944a-444f-ad81-f88bb8dec00e","Type":"ContainerStarted","Data":"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"}
Mar 12 12:47:03.514054 master-0 kubenswrapper[13984]: I0312 12:47:03.510417 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" podStartSLOduration=3.555083229 podStartE2EDuration="4.510399082s" podCreationTimestamp="2026-03-12 12:46:59 +0000 UTC" firstStartedPulling="2026-03-12 12:47:01.041321973 +0000 UTC m=+1353.239337465" lastFinishedPulling="2026-03-12 12:47:01.996637826 +0000 UTC m=+1354.194653318" observedRunningTime="2026-03-12 12:47:03.505553708 +0000 UTC m=+1355.703569230" watchObservedRunningTime="2026-03-12 12:47:03.510399082 +0000 UTC m=+1355.708414574"
Mar 12 12:47:04.508336 master-0 kubenswrapper[13984]: I0312 12:47:04.508276 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"13c0e883-944a-444f-ad81-f88bb8dec00e","Type":"ContainerStarted","Data":"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"}
Mar 12 12:47:04.512432 master-0 kubenswrapper[13984]: I0312 12:47:04.512399 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"d3192133-638b-48ca-83d2-a98db6495759","Type":"ContainerStarted","Data":"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"}
Mar 12 12:47:04.512588 master-0 kubenswrapper[13984]: I0312 12:47:04.512516 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-api-0" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-8c9c7-api-log" containerID="cri-o://5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090" gracePeriod=30
Mar 12 12:47:04.512725 master-0 kubenswrapper[13984]: I0312 12:47:04.512704 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:04.512810 master-0 kubenswrapper[13984]: I0312 12:47:04.512760 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-api-0" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-api" containerID="cri-o://40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b" gracePeriod=30
Mar 12 12:47:04.537498 master-0 kubenswrapper[13984]: I0312 12:47:04.535404 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"d1ce46a3-0909-4d15-b166-fa480a7e6164","Type":"ContainerStarted","Data":"e90f7d48875325837b809728e5f30a92e9a080085dac7bd5acce43c5d815fc5e"}
Mar 12 12:47:04.551005 master-0 kubenswrapper[13984]: I0312 12:47:04.550924 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-scheduler-0" podStartSLOduration=4.534144207 podStartE2EDuration="5.550904519s" podCreationTimestamp="2026-03-12 12:46:59 +0000 UTC" firstStartedPulling="2026-03-12 12:47:00.642116177 +0000 UTC m=+1352.840131659" lastFinishedPulling="2026-03-12 12:47:01.658876479 +0000 UTC m=+1353.856891971" observedRunningTime="2026-03-12 12:47:04.549774553 +0000 UTC m=+1356.747790045" watchObservedRunningTime="2026-03-12 12:47:04.550904519 +0000 UTC m=+1356.748920011"
Mar 12 12:47:04.634500 master-0 kubenswrapper[13984]: I0312 12:47:04.630825 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-backup-0" podStartSLOduration=4.603511372 podStartE2EDuration="5.63079622s" podCreationTimestamp="2026-03-12 12:46:59 +0000 UTC" firstStartedPulling="2026-03-12 12:47:01.586590987 +0000 UTC m=+1353.784606479" lastFinishedPulling="2026-03-12 12:47:02.613875835 +0000 UTC m=+1354.811891327" observedRunningTime="2026-03-12 12:47:04.610242068 +0000 UTC m=+1356.808257580" watchObservedRunningTime="2026-03-12 12:47:04.63079622 +0000 UTC m=+1356.828811702"
Mar 12 12:47:04.666893 master-0 kubenswrapper[13984]: I0312 12:47:04.663292 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-api-0" podStartSLOduration=5.663258499 podStartE2EDuration="5.663258499s" podCreationTimestamp="2026-03-12 12:46:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:04.654923074 +0000 UTC m=+1356.852938576" watchObservedRunningTime="2026-03-12 12:47:04.663258499 +0000 UTC m=+1356.861273981"
Mar 12 12:47:04.952993 master-0 kubenswrapper[13984]: I0312 12:47:04.938704 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:05.083581 master-0 kubenswrapper[13984]: I0312 12:47:05.081596 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:05.355511 master-0 kubenswrapper[13984]: I0312 12:47:05.352313 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:05.402503 master-0 kubenswrapper[13984]: I0312 12:47:05.401863 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:05.501554 master-0 kubenswrapper[13984]: I0312 12:47:05.499380 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6ghn\" (UniqueName: \"kubernetes.io/projected/d3192133-638b-48ca-83d2-a98db6495759-kube-api-access-t6ghn\") pod \"d3192133-638b-48ca-83d2-a98db6495759\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") "
Mar 12 12:47:05.501554 master-0 kubenswrapper[13984]: I0312 12:47:05.499441 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3192133-638b-48ca-83d2-a98db6495759-etc-machine-id\") pod \"d3192133-638b-48ca-83d2-a98db6495759\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") "
Mar 12 12:47:05.501554 master-0 kubenswrapper[13984]: I0312 12:47:05.499661 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-combined-ca-bundle\") pod \"d3192133-638b-48ca-83d2-a98db6495759\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") "
Mar 12 12:47:05.501554 master-0 kubenswrapper[13984]: I0312 12:47:05.499758 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-scripts\") pod \"d3192133-638b-48ca-83d2-a98db6495759\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") "
Mar 12 12:47:05.501554 master-0 kubenswrapper[13984]: I0312 12:47:05.499798 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data\") pod \"d3192133-638b-48ca-83d2-a98db6495759\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") "
Mar 12 12:47:05.501554 master-0 kubenswrapper[13984]: I0312 12:47:05.499853 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3192133-638b-48ca-83d2-a98db6495759-logs\") pod \"d3192133-638b-48ca-83d2-a98db6495759\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") "
Mar 12 12:47:05.501554 master-0 kubenswrapper[13984]: I0312 12:47:05.499903 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data-custom\") pod \"d3192133-638b-48ca-83d2-a98db6495759\" (UID: \"d3192133-638b-48ca-83d2-a98db6495759\") "
Mar 12 12:47:05.503602 master-0 kubenswrapper[13984]: I0312 12:47:05.503550 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d3192133-638b-48ca-83d2-a98db6495759-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d3192133-638b-48ca-83d2-a98db6495759" (UID: "d3192133-638b-48ca-83d2-a98db6495759"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:47:05.503960 master-0 kubenswrapper[13984]: I0312 12:47:05.503924 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3192133-638b-48ca-83d2-a98db6495759-logs" (OuterVolumeSpecName: "logs") pod "d3192133-638b-48ca-83d2-a98db6495759" (UID: "d3192133-638b-48ca-83d2-a98db6495759"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:47:05.505265 master-0 kubenswrapper[13984]: I0312 12:47:05.505220 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d3192133-638b-48ca-83d2-a98db6495759" (UID: "d3192133-638b-48ca-83d2-a98db6495759"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:05.509284 master-0 kubenswrapper[13984]: I0312 12:47:05.509232 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-scripts" (OuterVolumeSpecName: "scripts") pod "d3192133-638b-48ca-83d2-a98db6495759" (UID: "d3192133-638b-48ca-83d2-a98db6495759"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:05.517351 master-0 kubenswrapper[13984]: I0312 12:47:05.517287 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3192133-638b-48ca-83d2-a98db6495759-kube-api-access-t6ghn" (OuterVolumeSpecName: "kube-api-access-t6ghn") pod "d3192133-638b-48ca-83d2-a98db6495759" (UID: "d3192133-638b-48ca-83d2-a98db6495759"). InnerVolumeSpecName "kube-api-access-t6ghn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:47:05.541955 master-0 kubenswrapper[13984]: I0312 12:47:05.541655 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d3192133-638b-48ca-83d2-a98db6495759" (UID: "d3192133-638b-48ca-83d2-a98db6495759"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:05.579071 master-0 kubenswrapper[13984]: I0312 12:47:05.578393 13984 generic.go:334] "Generic (PLEG): container finished" podID="d3192133-638b-48ca-83d2-a98db6495759" containerID="40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b" exitCode=0
Mar 12 12:47:05.579071 master-0 kubenswrapper[13984]: I0312 12:47:05.578438 13984 generic.go:334] "Generic (PLEG): container finished" podID="d3192133-638b-48ca-83d2-a98db6495759" containerID="5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090" exitCode=143
Mar 12 12:47:05.579312 master-0 kubenswrapper[13984]: I0312 12:47:05.579192 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:05.579538 master-0 kubenswrapper[13984]: I0312 12:47:05.579463 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"d3192133-638b-48ca-83d2-a98db6495759","Type":"ContainerDied","Data":"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"}
Mar 12 12:47:05.579538 master-0 kubenswrapper[13984]: I0312 12:47:05.579514 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"d3192133-638b-48ca-83d2-a98db6495759","Type":"ContainerDied","Data":"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"}
Mar 12 12:47:05.579538 master-0 kubenswrapper[13984]: I0312 12:47:05.579527 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"d3192133-638b-48ca-83d2-a98db6495759","Type":"ContainerDied","Data":"0889e15aa37cd14d18ae98025e49ccca61a299a58f2fc48ce58222d6586d0ad3"}
Mar 12 12:47:05.579663 master-0 kubenswrapper[13984]: I0312 12:47:05.579545 13984 scope.go:117] "RemoveContainer" containerID="40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"
Mar 12 12:47:05.586096 master-0 kubenswrapper[13984]: I0312 12:47:05.585861 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data" (OuterVolumeSpecName: "config-data") pod "d3192133-638b-48ca-83d2-a98db6495759" (UID: "d3192133-638b-48ca-83d2-a98db6495759"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:05.604342 master-0 kubenswrapper[13984]: I0312 12:47:05.603330 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6ghn\" (UniqueName: \"kubernetes.io/projected/d3192133-638b-48ca-83d2-a98db6495759-kube-api-access-t6ghn\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:05.604342 master-0 kubenswrapper[13984]: I0312 12:47:05.603363 13984 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d3192133-638b-48ca-83d2-a98db6495759-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:05.604342 master-0 kubenswrapper[13984]: I0312 12:47:05.603373 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:05.604342 master-0 kubenswrapper[13984]: I0312 12:47:05.603381 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:05.604342 master-0 kubenswrapper[13984]: I0312 12:47:05.603391 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:05.604342 master-0 kubenswrapper[13984]: I0312 12:47:05.603399 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d3192133-638b-48ca-83d2-a98db6495759-logs\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:05.604342 master-0 kubenswrapper[13984]: I0312 12:47:05.603408 13984 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d3192133-638b-48ca-83d2-a98db6495759-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:05.619611 master-0 kubenswrapper[13984]: I0312 12:47:05.618747 13984 scope.go:117] "RemoveContainer" containerID="5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"
Mar 12 12:47:05.653519 master-0 kubenswrapper[13984]: I0312 12:47:05.653280 13984 scope.go:117] "RemoveContainer" containerID="40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"
Mar 12 12:47:05.654064 master-0 kubenswrapper[13984]: E0312 12:47:05.653732 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b\": container with ID starting with 40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b not found: ID does not exist" containerID="40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"
Mar 12 12:47:05.654064 master-0 kubenswrapper[13984]: I0312 12:47:05.653776 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"} err="failed to get container status \"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b\": rpc error: code = NotFound desc = could not find container \"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b\": container with ID starting with 40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b not found: ID does not exist"
Mar 12 12:47:05.654064 master-0 kubenswrapper[13984]: I0312 12:47:05.653806 13984 scope.go:117] "RemoveContainer" containerID="5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"
Mar 12 12:47:05.654250 master-0 kubenswrapper[13984]: E0312 12:47:05.654113 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090\": container with ID starting with 5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090 not found: ID does not exist" containerID="5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"
Mar 12 12:47:05.654250 master-0 kubenswrapper[13984]: I0312 12:47:05.654139 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"} err="failed to get container status \"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090\": rpc error: code = NotFound desc = could not find container \"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090\": container with ID starting with 5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090 not found: ID does not exist"
Mar 12 12:47:05.654250 master-0 kubenswrapper[13984]: I0312 12:47:05.654159 13984 scope.go:117] "RemoveContainer" containerID="40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"
Mar 12 12:47:05.654594 master-0 kubenswrapper[13984]: I0312 12:47:05.654387 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b"} err="failed to get container status \"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b\": rpc error: code = NotFound desc = could not find container \"40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b\": container with ID starting with 40964e93a23161addfe3bc8271935d761a47d0d50da5b03573bc27abe667c03b not found: ID does not exist"
Mar 12 12:47:05.654594 master-0 kubenswrapper[13984]: I0312 12:47:05.654412 13984 scope.go:117] "RemoveContainer" containerID="5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"
Mar 12 12:47:05.654728 master-0 kubenswrapper[13984]: I0312 12:47:05.654699 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090"} err="failed to get container status \"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090\": rpc error: code = NotFound desc = could not find container \"5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090\": container with ID starting with 5f581636b4cebf31c9794eea104369240dd14e6675b13331242233c580ef3090 not found: ID does not exist"
Mar 12 12:47:05.926586 master-0 kubenswrapper[13984]: I0312 12:47:05.926366 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-api-0"]
Mar 12 12:47:05.939357 master-0 kubenswrapper[13984]: I0312 12:47:05.939295 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8c9c7-api-0"]
Mar 12 12:47:05.953202 master-0 kubenswrapper[13984]: I0312 12:47:05.953142 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-api-0"]
Mar 12 12:47:05.953696 master-0 kubenswrapper[13984]: E0312 12:47:05.953671 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-api"
Mar 12 12:47:05.953696 master-0 kubenswrapper[13984]: I0312 12:47:05.953691 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-api"
Mar 12 12:47:05.953796 master-0 kubenswrapper[13984]: E0312 12:47:05.953712 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-8c9c7-api-log"
Mar 12 12:47:05.953796 master-0 kubenswrapper[13984]: I0312 12:47:05.953719 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-8c9c7-api-log"
Mar 12 12:47:05.953796 master-0 kubenswrapper[13984]: E0312 12:47:05.953733 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerName="init"
Mar 12 12:47:05.953796 master-0 kubenswrapper[13984]: I0312 12:47:05.953740 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerName="init"
Mar 12 12:47:05.953796 master-0 kubenswrapper[13984]: E0312 12:47:05.953756 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerName="dnsmasq-dns"
Mar 12 12:47:05.953796 master-0 kubenswrapper[13984]: I0312 12:47:05.953762 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerName="dnsmasq-dns"
Mar 12 12:47:05.953983 master-0 kubenswrapper[13984]: I0312 12:47:05.953961 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-8c9c7-api-log"
Mar 12 12:47:05.953983 master-0 kubenswrapper[13984]: I0312 12:47:05.953980 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3192133-638b-48ca-83d2-a98db6495759" containerName="cinder-api"
Mar 12 12:47:05.954048 master-0 kubenswrapper[13984]: I0312 12:47:05.954001 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea6d788-25be-45ad-8bb5-b67fa78befb5" containerName="dnsmasq-dns"
Mar 12 12:47:05.955137 master-0 kubenswrapper[13984]: I0312 12:47:05.955105 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:05.957105 master-0 kubenswrapper[13984]: I0312 12:47:05.957072 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-api-config-data"
Mar 12 12:47:05.957211 master-0 kubenswrapper[13984]: I0312 12:47:05.957183 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Mar 12 12:47:05.957268 master-0 kubenswrapper[13984]: I0312 12:47:05.957251 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Mar 12 12:47:06.086527 master-0 kubenswrapper[13984]: I0312 12:47:06.086370 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3192133-638b-48ca-83d2-a98db6495759" path="/var/lib/kubelet/pods/d3192133-638b-48ca-83d2-a98db6495759/volumes"
Mar 12 12:47:06.087392 master-0 kubenswrapper[13984]: I0312 12:47:06.087350 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-api-0"]
Mar 12 12:47:06.132161 master-0 kubenswrapper[13984]: I0312 12:47:06.132115 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg8nr\" (UniqueName: \"kubernetes.io/projected/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-kube-api-access-cg8nr\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.132759 master-0 kubenswrapper[13984]: I0312 12:47:06.132709 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-scripts\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.132843 master-0 kubenswrapper[13984]: I0312 12:47:06.132814 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-etc-machine-id\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.132902 master-0 kubenswrapper[13984]: I0312 12:47:06.132880 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-config-data\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.133080 master-0 kubenswrapper[13984]: I0312 12:47:06.133028 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-logs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.133490 master-0 kubenswrapper[13984]: I0312 12:47:06.133439 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-config-data-custom\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.133780 master-0 kubenswrapper[13984]: I0312 12:47:06.133748 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-public-tls-certs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.134109 master-0 kubenswrapper[13984]: I0312 12:47:06.133809 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-internal-tls-certs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.134230 master-0 kubenswrapper[13984]: I0312 12:47:06.134194 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-combined-ca-bundle\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.236707 master-0 kubenswrapper[13984]: I0312 12:47:06.236612 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-internal-tls-certs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.236950 master-0 kubenswrapper[13984]: I0312 12:47:06.236793 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-combined-ca-bundle\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.236950 master-0 kubenswrapper[13984]: I0312 12:47:06.236861 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg8nr\" (UniqueName: \"kubernetes.io/projected/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-kube-api-access-cg8nr\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.239092 master-0 kubenswrapper[13984]: I0312 12:47:06.239018 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-scripts\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.239305 master-0 kubenswrapper[13984]: I0312 12:47:06.239166 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-etc-machine-id\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.239423 master-0 kubenswrapper[13984]: I0312 12:47:06.239316 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-etc-machine-id\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.239423 master-0 kubenswrapper[13984]: I0312 12:47:06.239384 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-config-data\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.239713 master-0 kubenswrapper[13984]: I0312 12:47:06.239537 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-logs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:06.239713 master-0 kubenswrapper[13984]: I0312 12:47:06.239687 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName:
\"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-config-data-custom\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.239935 master-0 kubenswrapper[13984]: I0312 12:47:06.239778 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-public-tls-certs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.240043 master-0 kubenswrapper[13984]: I0312 12:47:06.239929 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-logs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.241268 master-0 kubenswrapper[13984]: I0312 12:47:06.241045 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-combined-ca-bundle\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.242499 master-0 kubenswrapper[13984]: I0312 12:47:06.242103 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-internal-tls-certs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.242891 master-0 kubenswrapper[13984]: I0312 12:47:06.242835 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-config-data-custom\") pod \"cinder-8c9c7-api-0\" (UID: 
\"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.244251 master-0 kubenswrapper[13984]: I0312 12:47:06.244192 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-public-tls-certs\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.244724 master-0 kubenswrapper[13984]: I0312 12:47:06.244661 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-scripts\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.246428 master-0 kubenswrapper[13984]: I0312 12:47:06.245886 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-config-data\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.258683 master-0 kubenswrapper[13984]: I0312 12:47:06.258586 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg8nr\" (UniqueName: \"kubernetes.io/projected/5fcc18ae-60e7-48a2-b6ef-9c6d98f38757-kube-api-access-cg8nr\") pod \"cinder-8c9c7-api-0\" (UID: \"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757\") " pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.425072 master-0 kubenswrapper[13984]: I0312 12:47:06.424955 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:06.596185 master-0 kubenswrapper[13984]: I0312 12:47:06.596137 13984 generic.go:334] "Generic (PLEG): container finished" podID="27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" containerID="090315b728384dca4a5ff387b80c6d14f3812cb087d7eb1f6b37c177b3518981" exitCode=0 Mar 12 12:47:06.596706 master-0 kubenswrapper[13984]: I0312 12:47:06.596215 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-ftw7v" event={"ID":"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3","Type":"ContainerDied","Data":"090315b728384dca4a5ff387b80c6d14f3812cb087d7eb1f6b37c177b3518981"} Mar 12 12:47:06.931918 master-0 kubenswrapper[13984]: I0312 12:47:06.931858 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-api-0"] Mar 12 12:47:07.617642 master-0 kubenswrapper[13984]: I0312 12:47:07.617577 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757","Type":"ContainerStarted","Data":"7dce8d5c652ac7f4c1153cc6a4a2888e577a50ab904ccf93cab6167a13d3e9fe"} Mar 12 12:47:07.617642 master-0 kubenswrapper[13984]: I0312 12:47:07.617637 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" event={"ID":"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757","Type":"ContainerStarted","Data":"cc4a9855cc1c5b85c64535e31dddeaa96ce74b30180eb8d5537f0c993f0b0c32"} Mar 12 12:47:08.162675 master-0 kubenswrapper[13984]: I0312 12:47:08.162627 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-ftw7v" Mar 12 12:47:08.213637 master-0 kubenswrapper[13984]: I0312 12:47:08.213594 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-combined-ca-bundle\") pod \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " Mar 12 12:47:08.213923 master-0 kubenswrapper[13984]: I0312 12:47:08.213902 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-scripts\") pod \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " Mar 12 12:47:08.214033 master-0 kubenswrapper[13984]: I0312 12:47:08.214017 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-etc-podinfo\") pod \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " Mar 12 12:47:08.214186 master-0 kubenswrapper[13984]: I0312 12:47:08.214169 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data-merged\") pod \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " Mar 12 12:47:08.214290 master-0 kubenswrapper[13984]: I0312 12:47:08.214274 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wh27\" (UniqueName: \"kubernetes.io/projected/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-kube-api-access-2wh27\") pod \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " Mar 12 12:47:08.215932 master-0 kubenswrapper[13984]: I0312 12:47:08.215906 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data\") pod \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\" (UID: \"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3\") " Mar 12 12:47:08.218204 master-0 kubenswrapper[13984]: I0312 12:47:08.218142 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" (UID: "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:08.223585 master-0 kubenswrapper[13984]: I0312 12:47:08.221775 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-kube-api-access-2wh27" (OuterVolumeSpecName: "kube-api-access-2wh27") pod "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" (UID: "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3"). InnerVolumeSpecName "kube-api-access-2wh27". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:08.223585 master-0 kubenswrapper[13984]: I0312 12:47:08.223225 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" (UID: "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3"). InnerVolumeSpecName "etc-podinfo". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 12:47:08.229765 master-0 kubenswrapper[13984]: I0312 12:47:08.229690 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-scripts" (OuterVolumeSpecName: "scripts") pod "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" (UID: "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:08.255376 master-0 kubenswrapper[13984]: I0312 12:47:08.255235 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data" (OuterVolumeSpecName: "config-data") pod "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" (UID: "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:08.291408 master-0 kubenswrapper[13984]: I0312 12:47:08.291300 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" (UID: "27bcfca5-8ad8-4bc5-b95b-0629c699b6e3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:08.319864 master-0 kubenswrapper[13984]: I0312 12:47:08.319810 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:08.319864 master-0 kubenswrapper[13984]: I0312 12:47:08.319851 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:08.319864 master-0 kubenswrapper[13984]: I0312 12:47:08.319860 13984 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:08.319864 master-0 kubenswrapper[13984]: I0312 12:47:08.319870 13984 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:08.319864 master-0 kubenswrapper[13984]: I0312 12:47:08.319881 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wh27\" (UniqueName: \"kubernetes.io/projected/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-kube-api-access-2wh27\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:08.319864 master-0 kubenswrapper[13984]: I0312 12:47:08.319889 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/27bcfca5-8ad8-4bc5-b95b-0629c699b6e3-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:08.634589 master-0 kubenswrapper[13984]: I0312 12:47:08.634536 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-api-0" 
event={"ID":"5fcc18ae-60e7-48a2-b6ef-9c6d98f38757","Type":"ContainerStarted","Data":"ca1351f098e2c015d53a8b8e8ce2fb78200e1c045905875ebb9aa7d3b6dfbe92"} Mar 12 12:47:08.635164 master-0 kubenswrapper[13984]: I0312 12:47:08.635145 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-8c9c7-api-0" Mar 12 12:47:08.636413 master-0 kubenswrapper[13984]: I0312 12:47:08.636391 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-ftw7v" event={"ID":"27bcfca5-8ad8-4bc5-b95b-0629c699b6e3","Type":"ContainerDied","Data":"571b965bd681fb7e9d87242725b38ee35935bddca9f6a47491d4f0b27b813caf"} Mar 12 12:47:08.636565 master-0 kubenswrapper[13984]: I0312 12:47:08.636553 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="571b965bd681fb7e9d87242725b38ee35935bddca9f6a47491d4f0b27b813caf" Mar 12 12:47:08.636665 master-0 kubenswrapper[13984]: I0312 12:47:08.636606 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-ftw7v" Mar 12 12:47:08.680160 master-0 kubenswrapper[13984]: I0312 12:47:08.676581 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-api-0" podStartSLOduration=3.676555044 podStartE2EDuration="3.676555044s" podCreationTimestamp="2026-03-12 12:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:08.671896463 +0000 UTC m=+1360.869911955" watchObservedRunningTime="2026-03-12 12:47:08.676555044 +0000 UTC m=+1360.874570536" Mar 12 12:47:09.108263 master-0 kubenswrapper[13984]: I0312 12:47:09.101006 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-n5b5j"] Mar 12 12:47:09.108263 master-0 kubenswrapper[13984]: E0312 12:47:09.101629 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" containerName="init" Mar 12 12:47:09.108263 master-0 kubenswrapper[13984]: I0312 12:47:09.101647 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" containerName="init" Mar 12 12:47:09.108263 master-0 kubenswrapper[13984]: E0312 12:47:09.101699 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" containerName="ironic-db-sync" Mar 12 12:47:09.108263 master-0 kubenswrapper[13984]: I0312 12:47:09.101706 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" containerName="ironic-db-sync" Mar 12 12:47:09.108263 master-0 kubenswrapper[13984]: I0312 12:47:09.101912 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="27bcfca5-8ad8-4bc5-b95b-0629c699b6e3" containerName="ironic-db-sync" Mar 12 12:47:09.108263 master-0 kubenswrapper[13984]: I0312 12:47:09.102603 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.114994 master-0 kubenswrapper[13984]: I0312 12:47:09.112876 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-n5b5j"] Mar 12 12:47:09.145334 master-0 kubenswrapper[13984]: I0312 12:47:09.142368 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjm6\" (UniqueName: \"kubernetes.io/projected/9990518c-8209-4d3a-aad6-130834718172-kube-api-access-sjjm6\") pod \"ironic-inspector-db-create-n5b5j\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.145334 master-0 kubenswrapper[13984]: I0312 12:47:09.142458 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9990518c-8209-4d3a-aad6-130834718172-operator-scripts\") pod \"ironic-inspector-db-create-n5b5j\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.251533 master-0 kubenswrapper[13984]: I0312 12:47:09.245075 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjm6\" (UniqueName: \"kubernetes.io/projected/9990518c-8209-4d3a-aad6-130834718172-kube-api-access-sjjm6\") pod \"ironic-inspector-db-create-n5b5j\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.251533 master-0 kubenswrapper[13984]: I0312 12:47:09.245133 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9990518c-8209-4d3a-aad6-130834718172-operator-scripts\") pod \"ironic-inspector-db-create-n5b5j\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.251533 
master-0 kubenswrapper[13984]: I0312 12:47:09.245985 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9990518c-8209-4d3a-aad6-130834718172-operator-scripts\") pod \"ironic-inspector-db-create-n5b5j\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.272084 master-0 kubenswrapper[13984]: I0312 12:47:09.261822 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-7750-account-create-update-mx9br"] Mar 12 12:47:09.292597 master-0 kubenswrapper[13984]: I0312 12:47:09.290461 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.294042 master-0 kubenswrapper[13984]: I0312 12:47:09.294005 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Mar 12 12:47:09.319876 master-0 kubenswrapper[13984]: I0312 12:47:09.319741 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-7750-account-create-update-mx9br"] Mar 12 12:47:09.335501 master-0 kubenswrapper[13984]: I0312 12:47:09.329839 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjm6\" (UniqueName: \"kubernetes.io/projected/9990518c-8209-4d3a-aad6-130834718172-kube-api-access-sjjm6\") pod \"ironic-inspector-db-create-n5b5j\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.391310 master-0 kubenswrapper[13984]: I0312 12:47:09.356663 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4wlj\" (UniqueName: \"kubernetes.io/projected/4f28f93e-f049-44fe-b721-ff4ad88db2b0-kube-api-access-j4wlj\") pod \"ironic-inspector-7750-account-create-update-mx9br\" (UID: 
\"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.391310 master-0 kubenswrapper[13984]: I0312 12:47:09.357051 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f28f93e-f049-44fe-b721-ff4ad88db2b0-operator-scripts\") pod \"ironic-inspector-7750-account-create-update-mx9br\" (UID: \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.391310 master-0 kubenswrapper[13984]: I0312 12:47:09.376207 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649858dd65-skgqw"] Mar 12 12:47:09.391310 master-0 kubenswrapper[13984]: I0312 12:47:09.376544 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-649858dd65-skgqw" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerName="dnsmasq-dns" containerID="cri-o://0129114a5a66baabd536135c9b5018f429623b02f0db9f17dc905df1e379b1f6" gracePeriod=10 Mar 12 12:47:09.391310 master-0 kubenswrapper[13984]: I0312 12:47:09.383660 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:09.477374 master-0 kubenswrapper[13984]: I0312 12:47:09.462956 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666f5ccff9-kdhq6"] Mar 12 12:47:09.487087 master-0 kubenswrapper[13984]: I0312 12:47:09.478834 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:09.487087 master-0 kubenswrapper[13984]: I0312 12:47:09.481151 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f28f93e-f049-44fe-b721-ff4ad88db2b0-operator-scripts\") pod \"ironic-inspector-7750-account-create-update-mx9br\" (UID: \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.487087 master-0 kubenswrapper[13984]: I0312 12:47:09.481300 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4wlj\" (UniqueName: \"kubernetes.io/projected/4f28f93e-f049-44fe-b721-ff4ad88db2b0-kube-api-access-j4wlj\") pod \"ironic-inspector-7750-account-create-update-mx9br\" (UID: \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.487087 master-0 kubenswrapper[13984]: I0312 12:47:09.481380 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.487087 master-0 kubenswrapper[13984]: I0312 12:47:09.482682 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f28f93e-f049-44fe-b721-ff4ad88db2b0-operator-scripts\") pod \"ironic-inspector-7750-account-create-update-mx9br\" (UID: \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.565324 master-0 kubenswrapper[13984]: I0312 12:47:09.497402 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666f5ccff9-kdhq6"] Mar 12 12:47:09.565324 master-0 kubenswrapper[13984]: I0312 12:47:09.557273 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-776949857-qhnzl"] Mar 12 12:47:09.576003 master-0 kubenswrapper[13984]: I0312 12:47:09.571264 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4wlj\" (UniqueName: \"kubernetes.io/projected/4f28f93e-f049-44fe-b721-ff4ad88db2b0-kube-api-access-j4wlj\") pod \"ironic-inspector-7750-account-create-update-mx9br\" (UID: \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.614864 master-0 kubenswrapper[13984]: I0312 12:47:09.614807 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.620001 master-0 kubenswrapper[13984]: I0312 12:47:09.619961 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Mar 12 12:47:09.658828 master-0 kubenswrapper[13984]: I0312 12:47:09.632737 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-config\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.658828 master-0 kubenswrapper[13984]: I0312 12:47:09.632871 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-svc\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.658828 master-0 kubenswrapper[13984]: I0312 12:47:09.633010 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-nb\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.658828 master-0 kubenswrapper[13984]: I0312 12:47:09.633269 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-sb\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.658828 master-0 kubenswrapper[13984]: I0312 
12:47:09.633428 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-swift-storage-0\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.658828 master-0 kubenswrapper[13984]: I0312 12:47:09.633543 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpc2m\" (UniqueName: \"kubernetes.io/projected/a1893ef7-895c-49c1-bfcc-468af72a46a6-kube-api-access-xpc2m\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.699902 master-0 kubenswrapper[13984]: I0312 12:47:09.699830 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:09.703543 master-0 kubenswrapper[13984]: I0312 12:47:09.701998 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-776949857-qhnzl"] Mar 12 12:47:09.743813 master-0 kubenswrapper[13984]: I0312 12:47:09.731296 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-59d99f4857-zbcnf"] Mar 12 12:47:09.743813 master-0 kubenswrapper[13984]: I0312 12:47:09.733887 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.743813 master-0 kubenswrapper[13984]: I0312 12:47:09.739318 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data" Mar 12 12:47:09.743813 master-0 kubenswrapper[13984]: I0312 12:47:09.740283 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport" Mar 12 12:47:09.743813 master-0 kubenswrapper[13984]: I0312 12:47:09.740547 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts" Mar 12 12:47:09.743813 master-0 kubenswrapper[13984]: I0312 12:47:09.740667 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 12 12:47:09.743813 master-0 kubenswrapper[13984]: I0312 12:47:09.740826 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.749379 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-59d99f4857-zbcnf"] Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758255 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-config\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758351 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-svc\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758439 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cecbed0-f638-495f-a450-ddcb64f6cc30-config\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758551 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-nb\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758591 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-sb\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758630 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kch8x\" (UniqueName: \"kubernetes.io/projected/4cecbed0-f638-495f-a450-ddcb64f6cc30-kube-api-access-kch8x\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758758 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-swift-storage-0\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " 
pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758831 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpc2m\" (UniqueName: \"kubernetes.io/projected/a1893ef7-895c-49c1-bfcc-468af72a46a6-kube-api-access-xpc2m\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.758857 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cecbed0-f638-495f-a450-ddcb64f6cc30-combined-ca-bundle\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.759816 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-config\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.763552 master-0 kubenswrapper[13984]: I0312 12:47:09.760545 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-sb\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.767464 master-0 kubenswrapper[13984]: I0312 12:47:09.767025 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-swift-storage-0\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: 
\"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.770564 master-0 kubenswrapper[13984]: I0312 12:47:09.770516 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-nb\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.786566 master-0 kubenswrapper[13984]: I0312 12:47:09.786517 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-svc\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.807194 master-0 kubenswrapper[13984]: I0312 12:47:09.807090 13984 generic.go:334] "Generic (PLEG): container finished" podID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerID="0129114a5a66baabd536135c9b5018f429623b02f0db9f17dc905df1e379b1f6" exitCode=0 Mar 12 12:47:09.807396 master-0 kubenswrapper[13984]: I0312 12:47:09.807249 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649858dd65-skgqw" event={"ID":"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09","Type":"ContainerDied","Data":"0129114a5a66baabd536135c9b5018f429623b02f0db9f17dc905df1e379b1f6"} Mar 12 12:47:09.838114 master-0 kubenswrapper[13984]: I0312 12:47:09.836183 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpc2m\" (UniqueName: \"kubernetes.io/projected/a1893ef7-895c-49c1-bfcc-468af72a46a6-kube-api-access-xpc2m\") pod \"dnsmasq-dns-666f5ccff9-kdhq6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.873074 master-0 kubenswrapper[13984]: I0312 12:47:09.872370 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-scripts\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.873074 master-0 kubenswrapper[13984]: I0312 12:47:09.872626 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-logs\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.873074 master-0 kubenswrapper[13984]: I0312 12:47:09.872755 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kch8x\" (UniqueName: \"kubernetes.io/projected/4cecbed0-f638-495f-a450-ddcb64f6cc30-kube-api-access-kch8x\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.873074 master-0 kubenswrapper[13984]: I0312 12:47:09.872961 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cecbed0-f638-495f-a450-ddcb64f6cc30-combined-ca-bundle\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.873074 master-0 kubenswrapper[13984]: I0312 12:47:09.872995 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/64252643-1a6d-4283-a6be-bdb4adbac235-etc-podinfo\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.873412 master-0 kubenswrapper[13984]: 
I0312 12:47:09.873091 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-combined-ca-bundle\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.873412 master-0 kubenswrapper[13984]: I0312 12:47:09.873166 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.873412 master-0 kubenswrapper[13984]: I0312 12:47:09.873199 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gr5c2\" (UniqueName: \"kubernetes.io/projected/64252643-1a6d-4283-a6be-bdb4adbac235-kube-api-access-gr5c2\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.873412 master-0 kubenswrapper[13984]: I0312 12:47:09.873364 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cecbed0-f638-495f-a450-ddcb64f6cc30-config\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.873412 master-0 kubenswrapper[13984]: I0312 12:47:09.873399 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-merged\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " 
pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.873579 master-0 kubenswrapper[13984]: I0312 12:47:09.873423 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-custom\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.892752 master-0 kubenswrapper[13984]: I0312 12:47:09.891861 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cecbed0-f638-495f-a450-ddcb64f6cc30-combined-ca-bundle\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.917687 master-0 kubenswrapper[13984]: I0312 12:47:09.917592 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kch8x\" (UniqueName: \"kubernetes.io/projected/4cecbed0-f638-495f-a450-ddcb64f6cc30-kube-api-access-kch8x\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.924119 master-0 kubenswrapper[13984]: I0312 12:47:09.924036 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4cecbed0-f638-495f-a450-ddcb64f6cc30-config\") pod \"ironic-neutron-agent-776949857-qhnzl\" (UID: \"4cecbed0-f638-495f-a450-ddcb64f6cc30\") " pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:09.935837 master-0 kubenswrapper[13984]: I0312 12:47:09.935623 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976618 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/64252643-1a6d-4283-a6be-bdb4adbac235-etc-podinfo\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976702 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-combined-ca-bundle\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976742 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976772 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gr5c2\" (UniqueName: \"kubernetes.io/projected/64252643-1a6d-4283-a6be-bdb4adbac235-kube-api-access-gr5c2\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976866 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-merged\") pod \"ironic-59d99f4857-zbcnf\" (UID: 
\"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976883 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-custom\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976935 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-scripts\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.977675 master-0 kubenswrapper[13984]: I0312 12:47:09.976989 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-logs\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.978675 master-0 kubenswrapper[13984]: I0312 12:47:09.978365 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-merged\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:09.985899 master-0 kubenswrapper[13984]: I0312 12:47:09.985858 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-logs\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 
12:47:10.034155 master-0 kubenswrapper[13984]: I0312 12:47:10.034085 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/64252643-1a6d-4283-a6be-bdb4adbac235-etc-podinfo\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:10.037043 master-0 kubenswrapper[13984]: I0312 12:47:10.036926 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-combined-ca-bundle\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:10.048610 master-0 kubenswrapper[13984]: I0312 12:47:10.044409 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-custom\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:10.048610 master-0 kubenswrapper[13984]: I0312 12:47:10.045425 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-scripts\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:10.048610 master-0 kubenswrapper[13984]: I0312 12:47:10.046425 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:10.052993 master-0 kubenswrapper[13984]: I0312 12:47:10.050631 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gr5c2\" (UniqueName: \"kubernetes.io/projected/64252643-1a6d-4283-a6be-bdb4adbac235-kube-api-access-gr5c2\") pod \"ironic-59d99f4857-zbcnf\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:10.099669 master-0 kubenswrapper[13984]: I0312 12:47:10.099090 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:10.325526 master-0 kubenswrapper[13984]: I0312 12:47:10.325457 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:10.541737 master-0 kubenswrapper[13984]: I0312 12:47:10.541606 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:10.549724 master-0 kubenswrapper[13984]: I0312 12:47:10.549031 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-n5b5j"] Mar 12 12:47:10.557195 master-0 kubenswrapper[13984]: I0312 12:47:10.557167 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:10.730661 master-0 kubenswrapper[13984]: I0312 12:47:10.730584 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpsqd\" (UniqueName: \"kubernetes.io/projected/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-kube-api-access-qpsqd\") pod \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " Mar 12 12:47:10.731203 master-0 kubenswrapper[13984]: I0312 12:47:10.730733 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-sb\") pod \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " Mar 12 12:47:10.731203 master-0 kubenswrapper[13984]: I0312 12:47:10.730809 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-swift-storage-0\") pod \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " Mar 12 12:47:10.731203 master-0 kubenswrapper[13984]: I0312 12:47:10.730866 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-nb\") pod \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " Mar 12 12:47:10.731203 master-0 kubenswrapper[13984]: I0312 12:47:10.730923 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-config\") pod \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " Mar 12 12:47:10.731203 master-0 kubenswrapper[13984]: I0312 12:47:10.730962 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-svc\") pod \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\" (UID: \"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09\") " Mar 12 12:47:10.732145 master-0 kubenswrapper[13984]: I0312 12:47:10.732063 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:10.742859 master-0 kubenswrapper[13984]: I0312 12:47:10.742796 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"] Mar 12 12:47:10.746740 master-0 kubenswrapper[13984]: I0312 12:47:10.746691 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-kube-api-access-qpsqd" (OuterVolumeSpecName: "kube-api-access-qpsqd") pod "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" (UID: "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09"). InnerVolumeSpecName "kube-api-access-qpsqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:10.840950 master-0 kubenswrapper[13984]: I0312 12:47:10.834375 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpsqd\" (UniqueName: \"kubernetes.io/projected/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-kube-api-access-qpsqd\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:10.883757 master-0 kubenswrapper[13984]: I0312 12:47:10.883675 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-n5b5j" event={"ID":"9990518c-8209-4d3a-aad6-130834718172","Type":"ContainerStarted","Data":"c3902f97155b33a7cd83667078ebc8cc64d532c9db50f2d28dddfed33a2cd7da"} Mar 12 12:47:10.891440 master-0 kubenswrapper[13984]: I0312 12:47:10.890694 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="cinder-volume" containerID="cri-o://6fccd0c52541641099f33277c03b55357011f80f52c7f0c1ed849de024caef7a" gracePeriod=30 Mar 12 12:47:10.891440 master-0 kubenswrapper[13984]: I0312 12:47:10.890808 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-649858dd65-skgqw" Mar 12 12:47:10.891440 master-0 kubenswrapper[13984]: I0312 12:47:10.891267 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-649858dd65-skgqw" event={"ID":"bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09","Type":"ContainerDied","Data":"9a778661502fd4144e742c88cd48ac41d9beaab32305c78ac59f4f378f4991bb"} Mar 12 12:47:10.891440 master-0 kubenswrapper[13984]: I0312 12:47:10.891297 13984 scope.go:117] "RemoveContainer" containerID="0129114a5a66baabd536135c9b5018f429623b02f0db9f17dc905df1e379b1f6" Mar 12 12:47:10.891440 master-0 kubenswrapper[13984]: I0312 12:47:10.891377 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="probe" containerID="cri-o://8e7c9395476f01caae167cd43716272e9e5128f931102612c91124b2bdccc17f" gracePeriod=30 Mar 12 12:47:10.983366 master-0 kubenswrapper[13984]: I0312 12:47:10.973654 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-7750-account-create-update-mx9br"] Mar 12 12:47:10.983366 master-0 kubenswrapper[13984]: I0312 12:47:10.973757 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:11.021172 master-0 kubenswrapper[13984]: I0312 12:47:11.021119 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" (UID: "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:11.045125 master-0 kubenswrapper[13984]: I0312 12:47:11.044979 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:11.052598 master-0 kubenswrapper[13984]: I0312 12:47:11.052535 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"] Mar 12 12:47:11.052836 master-0 kubenswrapper[13984]: I0312 12:47:11.052795 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-scheduler-0" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="cinder-scheduler" containerID="cri-o://6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11" gracePeriod=30 Mar 12 12:47:11.053334 master-0 kubenswrapper[13984]: I0312 12:47:11.053303 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-scheduler-0" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="probe" containerID="cri-o://315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411" gracePeriod=30 Mar 12 12:47:11.121328 master-0 kubenswrapper[13984]: I0312 12:47:11.121279 13984 scope.go:117] "RemoveContainer" containerID="2c117435077754597d1f54b4e4047dd1d45a80154a61615dc9286f99ef0294ec" Mar 12 12:47:11.140997 master-0 kubenswrapper[13984]: I0312 12:47:11.140273 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-config" (OuterVolumeSpecName: "config") pod "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" (UID: "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:11.168711 master-0 kubenswrapper[13984]: I0312 12:47:11.147907 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666f5ccff9-kdhq6"] Mar 12 12:47:11.168711 master-0 kubenswrapper[13984]: I0312 12:47:11.153033 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" (UID: "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:11.168711 master-0 kubenswrapper[13984]: I0312 12:47:11.153124 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:11.168711 master-0 kubenswrapper[13984]: I0312 12:47:11.165677 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" (UID: "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:11.258600 master-0 kubenswrapper[13984]: I0312 12:47:11.258554 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:11.258600 master-0 kubenswrapper[13984]: I0312 12:47:11.258594 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:11.267877 master-0 kubenswrapper[13984]: I0312 12:47:11.266712 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" (UID: "bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:11.336917 master-0 kubenswrapper[13984]: I0312 12:47:11.336377 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-backup-0"] Mar 12 12:47:11.354716 master-0 kubenswrapper[13984]: I0312 12:47:11.352143 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-776949857-qhnzl"] Mar 12 12:47:11.387503 master-0 kubenswrapper[13984]: I0312 12:47:11.366667 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:11.387503 master-0 kubenswrapper[13984]: I0312 12:47:11.375634 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-59d99f4857-zbcnf"] Mar 12 12:47:11.412449 master-0 kubenswrapper[13984]: I0312 12:47:11.412267 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Mar 12 12:47:11.415639 master-0 kubenswrapper[13984]: E0312 12:47:11.415052 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerName="dnsmasq-dns" Mar 12 12:47:11.415639 master-0 kubenswrapper[13984]: I0312 12:47:11.415076 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerName="dnsmasq-dns" Mar 12 12:47:11.415639 master-0 kubenswrapper[13984]: E0312 12:47:11.415156 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerName="init" Mar 12 12:47:11.415639 master-0 kubenswrapper[13984]: I0312 12:47:11.415185 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerName="init" Mar 12 12:47:11.419117 master-0 kubenswrapper[13984]: I0312 12:47:11.417103 13984 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerName="dnsmasq-dns" Mar 12 12:47:11.427596 master-0 kubenswrapper[13984]: I0312 12:47:11.425585 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Mar 12 12:47:11.445730 master-0 kubenswrapper[13984]: I0312 12:47:11.441121 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Mar 12 12:47:11.445730 master-0 kubenswrapper[13984]: I0312 12:47:11.441391 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Mar 12 12:47:11.451498 master-0 kubenswrapper[13984]: I0312 12:47:11.448431 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 12 12:47:11.590546 master-0 kubenswrapper[13984]: I0312 12:47:11.590488 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43e6fb75-b813-4074-8029-d6817b1bb9e2-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.590658 master-0 kubenswrapper[13984]: I0312 12:47:11.590582 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.590658 master-0 kubenswrapper[13984]: I0312 12:47:11.590629 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29p47\" (UniqueName: \"kubernetes.io/projected/43e6fb75-b813-4074-8029-d6817b1bb9e2-kube-api-access-29p47\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " 
pod="openstack/ironic-conductor-0" Mar 12 12:47:11.590735 master-0 kubenswrapper[13984]: I0312 12:47:11.590709 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.590774 master-0 kubenswrapper[13984]: I0312 12:47:11.590732 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.590911 master-0 kubenswrapper[13984]: I0312 12:47:11.590872 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.591008 master-0 kubenswrapper[13984]: I0312 12:47:11.590987 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-85e8cc03-fba7-424b-ba6f-a4e1b3868a65\" (UniqueName: \"kubernetes.io/csi/topolvm.io^25391d15-a761-47ce-9ac0-dd2f3ffee772\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.591371 master-0 kubenswrapper[13984]: I0312 12:47:11.591251 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-scripts\") pod \"ironic-conductor-0\" (UID: 
\"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.625894 master-0 kubenswrapper[13984]: I0312 12:47:11.625770 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-649858dd65-skgqw"] Mar 12 12:47:11.638029 master-0 kubenswrapper[13984]: I0312 12:47:11.637953 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-649858dd65-skgqw"] Mar 12 12:47:11.694232 master-0 kubenswrapper[13984]: I0312 12:47:11.694160 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.694232 master-0 kubenswrapper[13984]: I0312 12:47:11.694215 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-85e8cc03-fba7-424b-ba6f-a4e1b3868a65\" (UniqueName: \"kubernetes.io/csi/topolvm.io^25391d15-a761-47ce-9ac0-dd2f3ffee772\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.694464 master-0 kubenswrapper[13984]: I0312 12:47:11.694255 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-scripts\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.694464 master-0 kubenswrapper[13984]: I0312 12:47:11.694370 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43e6fb75-b813-4074-8029-d6817b1bb9e2-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.694464 master-0 
kubenswrapper[13984]: I0312 12:47:11.694403 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.694464 master-0 kubenswrapper[13984]: I0312 12:47:11.694446 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29p47\" (UniqueName: \"kubernetes.io/projected/43e6fb75-b813-4074-8029-d6817b1bb9e2-kube-api-access-29p47\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.695349 master-0 kubenswrapper[13984]: I0312 12:47:11.694967 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.695349 master-0 kubenswrapper[13984]: I0312 12:47:11.695048 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.696652 master-0 kubenswrapper[13984]: I0312 12:47:11.696558 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.712051 master-0 kubenswrapper[13984]: I0312 12:47:11.701693 13984 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.712051 master-0 kubenswrapper[13984]: I0312 12:47:11.702543 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-config-data\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.712051 master-0 kubenswrapper[13984]: I0312 12:47:11.704671 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 12 12:47:11.712051 master-0 kubenswrapper[13984]: I0312 12:47:11.704709 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-85e8cc03-fba7-424b-ba6f-a4e1b3868a65\" (UniqueName: \"kubernetes.io/csi/topolvm.io^25391d15-a761-47ce-9ac0-dd2f3ffee772\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/73c27e08f38cfc89b8a7fbef0ca9262df27ae3024bf7aef5f829d61ac20b6bb4/globalmount\"" pod="openstack/ironic-conductor-0" Mar 12 12:47:11.713087 master-0 kubenswrapper[13984]: I0312 12:47:11.712996 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/43e6fb75-b813-4074-8029-d6817b1bb9e2-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.730501 master-0 kubenswrapper[13984]: I0312 12:47:11.729888 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29p47\" (UniqueName: 
\"kubernetes.io/projected/43e6fb75-b813-4074-8029-d6817b1bb9e2-kube-api-access-29p47\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.731437 master-0 kubenswrapper[13984]: I0312 12:47:11.731298 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-scripts\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.741941 master-0 kubenswrapper[13984]: I0312 12:47:11.741883 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e6fb75-b813-4074-8029-d6817b1bb9e2-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0" Mar 12 12:47:11.935607 master-0 kubenswrapper[13984]: I0312 12:47:11.935554 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerStarted","Data":"b8252ac36025fc4cdb429bad9b3730fd3e3cf80a5d96db41ff05f12c2d4a2975"} Mar 12 12:47:11.937195 master-0 kubenswrapper[13984]: I0312 12:47:11.937002 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" event={"ID":"a1893ef7-895c-49c1-bfcc-468af72a46a6","Type":"ContainerStarted","Data":"1b74648070c6aa6c77655f42dff3b8e27cec60e8cba1149da994ef06d24c3504"} Mar 12 12:47:11.972913 master-0 kubenswrapper[13984]: I0312 12:47:11.972859 13984 generic.go:334] "Generic (PLEG): container finished" podID="9990518c-8209-4d3a-aad6-130834718172" containerID="f5c1d8b233aa8eccc27e8dc66afc4d0f96b92cb23ad8a65f2789d8f127cdd0ad" exitCode=0 Mar 12 12:47:11.973021 master-0 kubenswrapper[13984]: I0312 12:47:11.972964 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-inspector-db-create-n5b5j" event={"ID":"9990518c-8209-4d3a-aad6-130834718172","Type":"ContainerDied","Data":"f5c1d8b233aa8eccc27e8dc66afc4d0f96b92cb23ad8a65f2789d8f127cdd0ad"} Mar 12 12:47:11.977659 master-0 kubenswrapper[13984]: I0312 12:47:11.976156 13984 generic.go:334] "Generic (PLEG): container finished" podID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerID="8e7c9395476f01caae167cd43716272e9e5128f931102612c91124b2bdccc17f" exitCode=0 Mar 12 12:47:11.977659 master-0 kubenswrapper[13984]: I0312 12:47:11.976197 13984 generic.go:334] "Generic (PLEG): container finished" podID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerID="6fccd0c52541641099f33277c03b55357011f80f52c7f0c1ed849de024caef7a" exitCode=0 Mar 12 12:47:11.977659 master-0 kubenswrapper[13984]: I0312 12:47:11.976255 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"634c7359-7f98-4b5c-b01b-ace6fd3fcf34","Type":"ContainerDied","Data":"8e7c9395476f01caae167cd43716272e9e5128f931102612c91124b2bdccc17f"} Mar 12 12:47:11.977659 master-0 kubenswrapper[13984]: I0312 12:47:11.976292 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"634c7359-7f98-4b5c-b01b-ace6fd3fcf34","Type":"ContainerDied","Data":"6fccd0c52541641099f33277c03b55357011f80f52c7f0c1ed849de024caef7a"} Mar 12 12:47:11.987272 master-0 kubenswrapper[13984]: I0312 12:47:11.986682 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-backup-0" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerName="cinder-backup" containerID="cri-o://eeb54250e63c3840e474dc512537984d482244e8df6768125caa1d48b8cd85a3" gracePeriod=30 Mar 12 12:47:11.990909 master-0 kubenswrapper[13984]: I0312 12:47:11.988022 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-8c9c7-backup-0" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" 
containerName="probe" containerID="cri-o://e90f7d48875325837b809728e5f30a92e9a080085dac7bd5acce43c5d815fc5e" gracePeriod=30 Mar 12 12:47:12.014567 master-0 kubenswrapper[13984]: I0312 12:47:12.013640 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" path="/var/lib/kubelet/pods/bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09/volumes" Mar 12 12:47:12.014567 master-0 kubenswrapper[13984]: I0312 12:47:12.014291 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-776949857-qhnzl" event={"ID":"4cecbed0-f638-495f-a450-ddcb64f6cc30","Type":"ContainerStarted","Data":"8743eb24fbf06fff5d179b3eb4154349bb2c411ae9569de439fb798216752181"} Mar 12 12:47:12.014567 master-0 kubenswrapper[13984]: I0312 12:47:12.014314 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" event={"ID":"4f28f93e-f049-44fe-b721-ff4ad88db2b0","Type":"ContainerStarted","Data":"276ccab15a48c0b8f07b5a7cd6af3897a008560a290a0cec82c80f1efa5a7413"} Mar 12 12:47:12.014567 master-0 kubenswrapper[13984]: I0312 12:47:12.014326 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" event={"ID":"4f28f93e-f049-44fe-b721-ff4ad88db2b0","Type":"ContainerStarted","Data":"cf361470864c14d7b7b86825471ecf84a4a8d49f1b1eb6cd3d0f42b5ae446c66"} Mar 12 12:47:12.073628 master-0 kubenswrapper[13984]: I0312 12:47:12.073532 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" podStartSLOduration=3.073466989 podStartE2EDuration="3.073466989s" podCreationTimestamp="2026-03-12 12:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:12.041340174 +0000 UTC m=+1364.239355666" watchObservedRunningTime="2026-03-12 
12:47:12.073466989 +0000 UTC m=+1364.271482481" Mar 12 12:47:12.358714 master-0 kubenswrapper[13984]: I0312 12:47:12.358551 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:12.541904 master-0 kubenswrapper[13984]: I0312 12:47:12.541848 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-sys\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542114 master-0 kubenswrapper[13984]: I0312 12:47:12.541916 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84fm8\" (UniqueName: \"kubernetes.io/projected/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-kube-api-access-84fm8\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542114 master-0 kubenswrapper[13984]: I0312 12:47:12.541941 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542114 master-0 kubenswrapper[13984]: I0312 12:47:12.541965 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data-custom\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542114 master-0 kubenswrapper[13984]: I0312 12:47:12.542018 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-run\") pod 
\"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542114 master-0 kubenswrapper[13984]: I0312 12:47:12.542040 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-lib-cinder\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542114 master-0 kubenswrapper[13984]: I0312 12:47:12.542085 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-iscsi\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542311 master-0 kubenswrapper[13984]: I0312 12:47:12.542125 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-dev\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542311 master-0 kubenswrapper[13984]: I0312 12:47:12.542160 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-nvme\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542428 master-0 kubenswrapper[13984]: I0312 12:47:12.542345 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-cinder\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542428 master-0 kubenswrapper[13984]: I0312 12:47:12.542383 13984 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-brick\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542428 master-0 kubenswrapper[13984]: I0312 12:47:12.542418 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-combined-ca-bundle\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542764 master-0 kubenswrapper[13984]: I0312 12:47:12.542739 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-lib-modules\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542812 master-0 kubenswrapper[13984]: I0312 12:47:12.542782 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-scripts\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542861 master-0 kubenswrapper[13984]: I0312 12:47:12.542835 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-machine-id\") pod \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\" (UID: \"634c7359-7f98-4b5c-b01b-ace6fd3fcf34\") " Mar 12 12:47:12.542861 master-0 kubenswrapper[13984]: I0312 12:47:12.542847 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod 
"634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.543608 master-0 kubenswrapper[13984]: I0312 12:47:12.543565 13984 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:12.544942 master-0 kubenswrapper[13984]: I0312 12:47:12.544915 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-dev" (OuterVolumeSpecName: "dev") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.545075 master-0 kubenswrapper[13984]: I0312 12:47:12.544966 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.545075 master-0 kubenswrapper[13984]: I0312 12:47:12.544994 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "var-locks-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.545075 master-0 kubenswrapper[13984]: I0312 12:47:12.545018 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.557425 master-0 kubenswrapper[13984]: I0312 12:47:12.557351 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-sys" (OuterVolumeSpecName: "sys") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.557425 master-0 kubenswrapper[13984]: I0312 12:47:12.557381 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-run" (OuterVolumeSpecName: "run") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.558000 master-0 kubenswrapper[13984]: I0312 12:47:12.557462 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "var-lib-cinder". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.559558 master-0 kubenswrapper[13984]: I0312 12:47:12.558079 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.559558 master-0 kubenswrapper[13984]: I0312 12:47:12.558125 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:12.561124 master-0 kubenswrapper[13984]: I0312 12:47:12.561097 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-kube-api-access-84fm8" (OuterVolumeSpecName: "kube-api-access-84fm8") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "kube-api-access-84fm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:12.563921 master-0 kubenswrapper[13984]: I0312 12:47:12.563870 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:12.565511 master-0 kubenswrapper[13984]: I0312 12:47:12.565443 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-scripts" (OuterVolumeSpecName: "scripts") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:12.614139 master-0 kubenswrapper[13984]: I0312 12:47:12.614041 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:12.641238 master-0 kubenswrapper[13984]: I0312 12:47:12.641188 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:12.658666 master-0 kubenswrapper[13984]: I0312 12:47:12.657145 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data\") pod \"13c0e883-944a-444f-ad81-f88bb8dec00e\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " Mar 12 12:47:12.658666 master-0 kubenswrapper[13984]: I0312 12:47:12.657467 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-combined-ca-bundle\") pod \"13c0e883-944a-444f-ad81-f88bb8dec00e\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") " Mar 12 12:47:12.660580 master-0 kubenswrapper[13984]: I0312 12:47:12.660552 13984 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:12.660580 master-0 kubenswrapper[13984]: I0312 12:47:12.660578 13984 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660588 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660597 13984 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 
12:47:12.660606 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660614 13984 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660625 13984 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-sys\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660638 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84fm8\" (UniqueName: \"kubernetes.io/projected/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-kube-api-access-84fm8\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660649 13984 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660659 13984 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-run\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660667 13984 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-var-lib-cinder\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660675 13984 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-dev\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.660741 master-0 kubenswrapper[13984]: I0312 12:47:12.660683 13984 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-etc-nvme\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.761673 master-0 kubenswrapper[13984]: I0312 12:47:12.761608 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntmgk\" (UniqueName: \"kubernetes.io/projected/13c0e883-944a-444f-ad81-f88bb8dec00e-kube-api-access-ntmgk\") pod \"13c0e883-944a-444f-ad81-f88bb8dec00e\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") "
Mar 12 12:47:12.762616 master-0 kubenswrapper[13984]: I0312 12:47:12.761707 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data-custom\") pod \"13c0e883-944a-444f-ad81-f88bb8dec00e\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") "
Mar 12 12:47:12.762616 master-0 kubenswrapper[13984]: I0312 12:47:12.761795 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c0e883-944a-444f-ad81-f88bb8dec00e-etc-machine-id\") pod \"13c0e883-944a-444f-ad81-f88bb8dec00e\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") "
Mar 12 12:47:12.762616 master-0 kubenswrapper[13984]: I0312 12:47:12.762004 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-scripts\") pod \"13c0e883-944a-444f-ad81-f88bb8dec00e\" (UID: \"13c0e883-944a-444f-ad81-f88bb8dec00e\") "
Mar 12 12:47:12.762616 master-0 kubenswrapper[13984]: I0312 12:47:12.761911 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13c0e883-944a-444f-ad81-f88bb8dec00e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13c0e883-944a-444f-ad81-f88bb8dec00e" (UID: "13c0e883-944a-444f-ad81-f88bb8dec00e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 12 12:47:12.763260 master-0 kubenswrapper[13984]: I0312 12:47:12.763187 13984 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13c0e883-944a-444f-ad81-f88bb8dec00e-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.768393 master-0 kubenswrapper[13984]: I0312 12:47:12.768331 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13c0e883-944a-444f-ad81-f88bb8dec00e-kube-api-access-ntmgk" (OuterVolumeSpecName: "kube-api-access-ntmgk") pod "13c0e883-944a-444f-ad81-f88bb8dec00e" (UID: "13c0e883-944a-444f-ad81-f88bb8dec00e"). InnerVolumeSpecName "kube-api-access-ntmgk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:47:12.770653 master-0 kubenswrapper[13984]: I0312 12:47:12.770590 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13c0e883-944a-444f-ad81-f88bb8dec00e" (UID: "13c0e883-944a-444f-ad81-f88bb8dec00e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:12.775166 master-0 kubenswrapper[13984]: I0312 12:47:12.775082 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data" (OuterVolumeSpecName: "config-data") pod "634c7359-7f98-4b5c-b01b-ace6fd3fcf34" (UID: "634c7359-7f98-4b5c-b01b-ace6fd3fcf34"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:12.794599 master-0 kubenswrapper[13984]: I0312 12:47:12.794518 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-scripts" (OuterVolumeSpecName: "scripts") pod "13c0e883-944a-444f-ad81-f88bb8dec00e" (UID: "13c0e883-944a-444f-ad81-f88bb8dec00e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:12.821346 master-0 kubenswrapper[13984]: I0312 12:47:12.819258 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13c0e883-944a-444f-ad81-f88bb8dec00e" (UID: "13c0e883-944a-444f-ad81-f88bb8dec00e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:12.866700 master-0 kubenswrapper[13984]: I0312 12:47:12.866566 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/634c7359-7f98-4b5c-b01b-ace6fd3fcf34-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.866700 master-0 kubenswrapper[13984]: I0312 12:47:12.866614 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.866700 master-0 kubenswrapper[13984]: I0312 12:47:12.866628 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntmgk\" (UniqueName: \"kubernetes.io/projected/13c0e883-944a-444f-ad81-f88bb8dec00e-kube-api-access-ntmgk\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.866700 master-0 kubenswrapper[13984]: I0312 12:47:12.866641 13984 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.866700 master-0 kubenswrapper[13984]: I0312 12:47:12.866653 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:12.888152 master-0 kubenswrapper[13984]: I0312 12:47:12.888027 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data" (OuterVolumeSpecName: "config-data") pod "13c0e883-944a-444f-ad81-f88bb8dec00e" (UID: "13c0e883-944a-444f-ad81-f88bb8dec00e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:12.967513 master-0 kubenswrapper[13984]: I0312 12:47:12.967454 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13c0e883-944a-444f-ad81-f88bb8dec00e-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:13.013341 master-0 kubenswrapper[13984]: I0312 12:47:13.013260 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"634c7359-7f98-4b5c-b01b-ace6fd3fcf34","Type":"ContainerDied","Data":"226ba7589e7220629d1966ba084c24569efd6699287db9487cbdefdcacf723ff"}
Mar 12 12:47:13.013543 master-0 kubenswrapper[13984]: I0312 12:47:13.013354 13984 scope.go:117] "RemoveContainer" containerID="8e7c9395476f01caae167cd43716272e9e5128f931102612c91124b2bdccc17f"
Mar 12 12:47:13.013543 master-0 kubenswrapper[13984]: I0312 12:47:13.013282 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.026229 master-0 kubenswrapper[13984]: I0312 12:47:13.026169 13984 generic.go:334] "Generic (PLEG): container finished" podID="4f28f93e-f049-44fe-b721-ff4ad88db2b0" containerID="276ccab15a48c0b8f07b5a7cd6af3897a008560a290a0cec82c80f1efa5a7413" exitCode=0
Mar 12 12:47:13.026447 master-0 kubenswrapper[13984]: I0312 12:47:13.026251 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" event={"ID":"4f28f93e-f049-44fe-b721-ff4ad88db2b0","Type":"ContainerDied","Data":"276ccab15a48c0b8f07b5a7cd6af3897a008560a290a0cec82c80f1efa5a7413"}
Mar 12 12:47:13.031247 master-0 kubenswrapper[13984]: I0312 12:47:13.031194 13984 generic.go:334] "Generic (PLEG): container finished" podID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerID="315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411" exitCode=0
Mar 12 12:47:13.031247 master-0 kubenswrapper[13984]: I0312 12:47:13.031232 13984 generic.go:334] "Generic (PLEG): container finished" podID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerID="6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11" exitCode=0
Mar 12 12:47:13.031247 master-0 kubenswrapper[13984]: I0312 12:47:13.031236 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"13c0e883-944a-444f-ad81-f88bb8dec00e","Type":"ContainerDied","Data":"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"}
Mar 12 12:47:13.031507 master-0 kubenswrapper[13984]: I0312 12:47:13.031295 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"13c0e883-944a-444f-ad81-f88bb8dec00e","Type":"ContainerDied","Data":"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"}
Mar 12 12:47:13.031507 master-0 kubenswrapper[13984]: I0312 12:47:13.031302 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.031701 master-0 kubenswrapper[13984]: I0312 12:47:13.031313 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"13c0e883-944a-444f-ad81-f88bb8dec00e","Type":"ContainerDied","Data":"b3b9296fce29362ac2c5a49ef3005c1b05cd9732d62ce37aed068a5955a4140a"}
Mar 12 12:47:13.036080 master-0 kubenswrapper[13984]: I0312 12:47:13.036043 13984 generic.go:334] "Generic (PLEG): container finished" podID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerID="7d9d5482a5114e010655542a41042f34fd17740c8c1ca188ba9ee4c8c2937639" exitCode=0
Mar 12 12:47:13.036200 master-0 kubenswrapper[13984]: I0312 12:47:13.036127 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" event={"ID":"a1893ef7-895c-49c1-bfcc-468af72a46a6","Type":"ContainerDied","Data":"7d9d5482a5114e010655542a41042f34fd17740c8c1ca188ba9ee4c8c2937639"}
Mar 12 12:47:13.039941 master-0 kubenswrapper[13984]: I0312 12:47:13.039821 13984 generic.go:334] "Generic (PLEG): container finished" podID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerID="e90f7d48875325837b809728e5f30a92e9a080085dac7bd5acce43c5d815fc5e" exitCode=0
Mar 12 12:47:13.039941 master-0 kubenswrapper[13984]: I0312 12:47:13.039892 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"d1ce46a3-0909-4d15-b166-fa480a7e6164","Type":"ContainerDied","Data":"e90f7d48875325837b809728e5f30a92e9a080085dac7bd5acce43c5d815fc5e"}
Mar 12 12:47:13.049903 master-0 kubenswrapper[13984]: I0312 12:47:13.047743 13984 scope.go:117] "RemoveContainer" containerID="6fccd0c52541641099f33277c03b55357011f80f52c7f0c1ed849de024caef7a"
Mar 12 12:47:13.083874 master-0 kubenswrapper[13984]: I0312 12:47:13.083831 13984 scope.go:117] "RemoveContainer" containerID="315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"
Mar 12 12:47:13.136372 master-0 kubenswrapper[13984]: I0312 12:47:13.136231 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"]
Mar 12 12:47:13.170988 master-0 kubenswrapper[13984]: I0312 12:47:13.170930 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"]
Mar 12 12:47:13.220889 master-0 kubenswrapper[13984]: I0312 12:47:13.219557 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"]
Mar 12 12:47:13.250269 master-0 kubenswrapper[13984]: I0312 12:47:13.249706 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"]
Mar 12 12:47:13.263707 master-0 kubenswrapper[13984]: I0312 12:47:13.263551 13984 scope.go:117] "RemoveContainer" containerID="6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"
Mar 12 12:47:13.301517 master-0 kubenswrapper[13984]: I0312 12:47:13.301435 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"]
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.301720 13984 scope.go:117] "RemoveContainer" containerID="315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: E0312 12:47:13.301852 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="cinder-scheduler"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.301867 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="cinder-scheduler"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: E0312 12:47:13.301885 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="cinder-volume"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.301892 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="cinder-volume"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: E0312 12:47:13.301907 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="probe"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.301914 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="probe"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: E0312 12:47:13.301927 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="probe"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.301933 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="probe"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.302163 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="probe"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.302183 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="cinder-scheduler"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.302212 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" containerName="probe"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.302233 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" containerName="cinder-volume"
Mar 12 12:47:13.304852 master-0 kubenswrapper[13984]: I0312 12:47:13.303295 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.311265 master-0 kubenswrapper[13984]: I0312 12:47:13.306256 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-volume-lvm-iscsi-config-data"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: E0312 12:47:13.325855 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411\": container with ID starting with 315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411 not found: ID does not exist" containerID="315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.325904 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"} err="failed to get container status \"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411\": rpc error: code = NotFound desc = could not find container \"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411\": container with ID starting with 315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411 not found: ID does not exist"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.325932 13984 scope.go:117] "RemoveContainer" containerID="6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: E0312 12:47:13.326370 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11\": container with ID starting with 6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11 not found: ID does not exist" containerID="6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.326392 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"} err="failed to get container status \"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11\": rpc error: code = NotFound desc = could not find container \"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11\": container with ID starting with 6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11 not found: ID does not exist"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.326407 13984 scope.go:117] "RemoveContainer" containerID="315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.326454 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"]
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.326754 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411"} err="failed to get container status \"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411\": rpc error: code = NotFound desc = could not find container \"315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411\": container with ID starting with 315f681e8ca06d11d0aa0a9c6d2a3fc7960b9ffa976860f10804d661ad3ce411 not found: ID does not exist"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.326771 13984 scope.go:117] "RemoveContainer" containerID="6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"
Mar 12 12:47:13.327694 master-0 kubenswrapper[13984]: I0312 12:47:13.327102 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11"} err="failed to get container status \"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11\": rpc error: code = NotFound desc = could not find container \"6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11\": container with ID starting with 6add143ea68392d5be976373ad7a0f26fe71e96f5fd95d2cc1bd6ae6702e1c11 not found: ID does not exist"
Mar 12 12:47:13.348890 master-0 kubenswrapper[13984]: I0312 12:47:13.348837 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"]
Mar 12 12:47:13.352077 master-0 kubenswrapper[13984]: I0312 12:47:13.352019 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.366406 master-0 kubenswrapper[13984]: I0312 12:47:13.358498 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-scheduler-config-data"
Mar 12 12:47:13.367565 master-0 kubenswrapper[13984]: I0312 12:47:13.367155 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"]
Mar 12 12:47:13.502351 master-0 kubenswrapper[13984]: I0312 12:47:13.502272 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-locks-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.502633 master-0 kubenswrapper[13984]: I0312 12:47:13.502455 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-lib-modules\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.502633 master-0 kubenswrapper[13984]: I0312 12:47:13.502553 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-run\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.502724 master-0 kubenswrapper[13984]: I0312 12:47:13.502630 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-combined-ca-bundle\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.502724 master-0 kubenswrapper[13984]: I0312 12:47:13.502684 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-config-data-custom\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.502825 master-0 kubenswrapper[13984]: I0312 12:47:13.502743 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-dev\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.502825 master-0 kubenswrapper[13984]: I0312 12:47:13.502792 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bffc1e39-45d1-496f-b4af-8083ec87dabf-etc-machine-id\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.502825 master-0 kubenswrapper[13984]: I0312 12:47:13.502814 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmprt\" (UniqueName: \"kubernetes.io/projected/bffc1e39-45d1-496f-b4af-8083ec87dabf-kube-api-access-hmprt\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.502956 master-0 kubenswrapper[13984]: I0312 12:47:13.502842 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-iscsi\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.502956 master-0 kubenswrapper[13984]: I0312 12:47:13.502888 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-config-data-custom\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.503041 master-0 kubenswrapper[13984]: I0312 12:47:13.502960 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-config-data\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.503041 master-0 kubenswrapper[13984]: I0312 12:47:13.503026 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-locks-brick\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504525 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sktlv\" (UniqueName: \"kubernetes.io/projected/e688c798-286e-49cf-93f7-9c95e822b759-kube-api-access-sktlv\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504562 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-nvme\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504601 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-config-data\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504651 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-lib-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504673 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-scripts\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504699 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-machine-id\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504723 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-sys\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504740 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-combined-ca-bundle\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.505403 master-0 kubenswrapper[13984]: I0312 12:47:13.504842 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-scripts\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.586726 master-0 kubenswrapper[13984]: I0312 12:47:13.586675 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-85e8cc03-fba7-424b-ba6f-a4e1b3868a65\" (UniqueName: \"kubernetes.io/csi/topolvm.io^25391d15-a761-47ce-9ac0-dd2f3ffee772\") pod \"ironic-conductor-0\" (UID: \"43e6fb75-b813-4074-8029-d6817b1bb9e2\") " pod="openstack/ironic-conductor-0"
Mar 12 12:47:13.607120 master-0 kubenswrapper[13984]: I0312 12:47:13.607057 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sktlv\" (UniqueName: \"kubernetes.io/projected/e688c798-286e-49cf-93f7-9c95e822b759-kube-api-access-sktlv\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607303 master-0 kubenswrapper[13984]: I0312 12:47:13.607125 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-nvme\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607303 master-0 kubenswrapper[13984]: I0312 12:47:13.607175 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-config-data\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607303 master-0 kubenswrapper[13984]: I0312 12:47:13.607228 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-lib-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607303 master-0 kubenswrapper[13984]: I0312 12:47:13.607257 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-scripts\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.607303 master-0 kubenswrapper[13984]: I0312 12:47:13.607286 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-machine-id\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607473 master-0 kubenswrapper[13984]: I0312 12:47:13.607314 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-sys\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607473 master-0 kubenswrapper[13984]: I0312 12:47:13.607337 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-combined-ca-bundle\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.607473 master-0 kubenswrapper[13984]: I0312 12:47:13.607383 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-scripts\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607473 master-0 kubenswrapper[13984]: I0312 12:47:13.607422 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-locks-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607473 master-0 kubenswrapper[13984]: I0312 12:47:13.607458 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-lib-modules\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607642 master-0 kubenswrapper[13984]: I0312 12:47:13.607507 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-run\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607642 master-0 kubenswrapper[13984]: I0312 12:47:13.607547 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-combined-ca-bundle\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607642 master-0 kubenswrapper[13984]: I0312 12:47:13.607574 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-config-data-custom\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.607642 master-0 kubenswrapper[13984]: I0312 12:47:13.607604 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-dev\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.607642 master-0 kubenswrapper[13984]: I0312 12:47:13.607633 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bffc1e39-45d1-496f-b4af-8083ec87dabf-etc-machine-id\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.607781 master-0 kubenswrapper[13984]: I0312 12:47:13.607656 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmprt\" (UniqueName: \"kubernetes.io/projected/bffc1e39-45d1-496f-b4af-8083ec87dabf-kube-api-access-hmprt\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.608140 master-0 kubenswrapper[13984]: I0312 12:47:13.608087 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-nvme\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.608307 master-0 kubenswrapper[13984]: I0312 12:47:13.608247 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/bffc1e39-45d1-496f-b4af-8083ec87dabf-etc-machine-id\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0"
Mar 12 12:47:13.608844 master-0 kubenswrapper[13984]: I0312 12:47:13.608553 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-locks-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.608975 master-0 kubenswrapper[13984]: I0312 12:47:13.608340 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-lib-modules\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.609177 master-0 kubenswrapper[13984]: I0312 12:47:13.609159 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-lib-cinder\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.609418 master-0 kubenswrapper[13984]: I0312 12:47:13.609379 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-machine-id\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0"
Mar 12 12:47:13.609512 master-0 kubenswrapper[13984]: I0312 12:47:13.609452 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName:
\"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-sys\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.609512 master-0 kubenswrapper[13984]: I0312 12:47:13.609493 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-dev\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.609591 master-0 kubenswrapper[13984]: I0312 12:47:13.609516 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-run\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.609591 master-0 kubenswrapper[13984]: I0312 12:47:13.609545 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-iscsi\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.609654 master-0 kubenswrapper[13984]: I0312 12:47:13.609598 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-config-data-custom\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.609692 master-0 kubenswrapper[13984]: I0312 12:47:13.609668 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-config-data\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:13.609728 master-0 kubenswrapper[13984]: I0312 12:47:13.609695 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-etc-iscsi\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.609834 master-0 kubenswrapper[13984]: I0312 12:47:13.609810 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-locks-brick\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.610110 master-0 kubenswrapper[13984]: I0312 12:47:13.610087 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e688c798-286e-49cf-93f7-9c95e822b759-var-locks-brick\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.616692 master-0 kubenswrapper[13984]: I0312 12:47:13.616644 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-config-data-custom\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.618064 master-0 kubenswrapper[13984]: I0312 12:47:13.618014 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-config-data-custom\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:13.620681 master-0 kubenswrapper[13984]: I0312 12:47:13.619077 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-combined-ca-bundle\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.621269 master-0 kubenswrapper[13984]: I0312 12:47:13.620928 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-combined-ca-bundle\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:13.621269 master-0 kubenswrapper[13984]: I0312 12:47:13.621198 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-config-data\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.623578 master-0 kubenswrapper[13984]: I0312 12:47:13.621830 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e688c798-286e-49cf-93f7-9c95e822b759-scripts\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.623578 master-0 kubenswrapper[13984]: I0312 12:47:13.622594 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-config-data\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:13.623578 master-0 kubenswrapper[13984]: I0312 12:47:13.622981 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bffc1e39-45d1-496f-b4af-8083ec87dabf-scripts\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:13.628492 master-0 kubenswrapper[13984]: I0312 12:47:13.628377 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmprt\" (UniqueName: \"kubernetes.io/projected/bffc1e39-45d1-496f-b4af-8083ec87dabf-kube-api-access-hmprt\") pod \"cinder-8c9c7-scheduler-0\" (UID: \"bffc1e39-45d1-496f-b4af-8083ec87dabf\") " pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:13.632255 master-0 kubenswrapper[13984]: I0312 12:47:13.632205 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sktlv\" (UniqueName: \"kubernetes.io/projected/e688c798-286e-49cf-93f7-9c95e822b759-kube-api-access-sktlv\") pod \"cinder-8c9c7-volume-lvm-iscsi-0\" (UID: \"e688c798-286e-49cf-93f7-9c95e822b759\") " pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.686535 master-0 kubenswrapper[13984]: I0312 12:47:13.686417 13984 trace.go:236] Trace[755455974]: "Calculate volume metrics of mysql-db for pod openstack/openstack-galera-0" (12-Mar-2026 12:47:12.306) (total time: 1380ms): Mar 12 12:47:13.686535 master-0 kubenswrapper[13984]: Trace[755455974]: [1.380097559s] [1.380097559s] END Mar 12 12:47:13.696992 master-0 kubenswrapper[13984]: I0312 12:47:13.696954 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:13.864161 master-0 kubenswrapper[13984]: I0312 12:47:13.863972 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0" Mar 12 12:47:13.908840 master-0 kubenswrapper[13984]: I0312 12:47:13.908018 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-5bc96c7dbf-x46lx"] Mar 12 12:47:13.920749 master-0 kubenswrapper[13984]: I0312 12:47:13.920694 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:13.930794 master-0 kubenswrapper[13984]: I0312 12:47:13.925781 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:13.953302 master-0 kubenswrapper[13984]: I0312 12:47:13.953131 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Mar 12 12:47:13.954052 master-0 kubenswrapper[13984]: I0312 12:47:13.953794 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Mar 12 12:47:13.961856 master-0 kubenswrapper[13984]: I0312 12:47:13.961401 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-5bc96c7dbf-x46lx"] Mar 12 12:47:14.011532 master-0 kubenswrapper[13984]: I0312 12:47:14.009226 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13c0e883-944a-444f-ad81-f88bb8dec00e" path="/var/lib/kubelet/pods/13c0e883-944a-444f-ad81-f88bb8dec00e/volumes" Mar 12 12:47:14.011532 master-0 kubenswrapper[13984]: I0312 12:47:14.010193 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="634c7359-7f98-4b5c-b01b-ace6fd3fcf34" path="/var/lib/kubelet/pods/634c7359-7f98-4b5c-b01b-ace6fd3fcf34/volumes" Mar 12 12:47:14.062129 master-0 kubenswrapper[13984]: I0312 12:47:14.062034 13984 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" event={"ID":"a1893ef7-895c-49c1-bfcc-468af72a46a6","Type":"ContainerStarted","Data":"19e0c2ceb1955d1c076fe745679ae61a2f682c3d2b1f903741f2b70bad93b5dc"} Mar 12 12:47:14.062408 master-0 kubenswrapper[13984]: I0312 12:47:14.062329 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:47:14.085282 master-0 kubenswrapper[13984]: I0312 12:47:14.085174 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" podStartSLOduration=5.085156821 podStartE2EDuration="5.085156821s" podCreationTimestamp="2026-03-12 12:47:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:14.084297426 +0000 UTC m=+1366.282312938" watchObservedRunningTime="2026-03-12 12:47:14.085156821 +0000 UTC m=+1366.283172313" Mar 12 12:47:14.121293 master-0 kubenswrapper[13984]: I0312 12:47:14.120846 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-internal-tls-certs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.121446 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122232 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-combined-ca-bundle\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122256 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-logs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122278 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-etc-podinfo\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122339 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdmq2\" (UniqueName: \"kubernetes.io/projected/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-kube-api-access-fdmq2\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122360 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-public-tls-certs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122415 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-scripts\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122484 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data-merged\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.122648 master-0 kubenswrapper[13984]: I0312 12:47:14.122502 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data-custom\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.224945 master-0 kubenswrapper[13984]: I0312 12:47:14.224702 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data-merged\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.224945 master-0 kubenswrapper[13984]: I0312 12:47:14.224770 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data-custom\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.224945 master-0 kubenswrapper[13984]: I0312 12:47:14.224870 
13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-internal-tls-certs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.224945 master-0 kubenswrapper[13984]: I0312 12:47:14.224914 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.225257 master-0 kubenswrapper[13984]: I0312 12:47:14.224963 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-combined-ca-bundle\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.225257 master-0 kubenswrapper[13984]: I0312 12:47:14.224983 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-logs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.225257 master-0 kubenswrapper[13984]: I0312 12:47:14.225003 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-etc-podinfo\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.225257 master-0 kubenswrapper[13984]: I0312 12:47:14.225045 13984 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fdmq2\" (UniqueName: \"kubernetes.io/projected/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-kube-api-access-fdmq2\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.225257 master-0 kubenswrapper[13984]: I0312 12:47:14.225063 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-public-tls-certs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.225257 master-0 kubenswrapper[13984]: I0312 12:47:14.225116 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-scripts\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.227213 master-0 kubenswrapper[13984]: I0312 12:47:14.227161 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-logs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.228934 master-0 kubenswrapper[13984]: I0312 12:47:14.228725 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data-merged\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.231511 master-0 kubenswrapper[13984]: I0312 12:47:14.231443 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-scripts\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.235900 master-0 kubenswrapper[13984]: I0312 12:47:14.235178 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data-custom\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.235900 master-0 kubenswrapper[13984]: I0312 12:47:14.235743 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-internal-tls-certs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.235900 master-0 kubenswrapper[13984]: I0312 12:47:14.235821 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-config-data\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.236998 master-0 kubenswrapper[13984]: I0312 12:47:14.236874 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-combined-ca-bundle\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.241585 master-0 kubenswrapper[13984]: I0312 12:47:14.241203 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-public-tls-certs\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.248619 master-0 kubenswrapper[13984]: I0312 12:47:14.248554 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-etc-podinfo\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.262897 master-0 kubenswrapper[13984]: I0312 12:47:14.262833 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdmq2\" (UniqueName: \"kubernetes.io/projected/431f12da-d4a8-4748-87d1-4ffdd0b1ecc4-kube-api-access-fdmq2\") pod \"ironic-5bc96c7dbf-x46lx\" (UID: \"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4\") " pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.341925 master-0 kubenswrapper[13984]: I0312 12:47:14.341286 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:14.417635 master-0 kubenswrapper[13984]: I0312 12:47:14.417299 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:14.548559 master-0 kubenswrapper[13984]: I0312 12:47:14.548450 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9990518c-8209-4d3a-aad6-130834718172-operator-scripts\") pod \"9990518c-8209-4d3a-aad6-130834718172\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " Mar 12 12:47:14.549026 master-0 kubenswrapper[13984]: I0312 12:47:14.548852 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjjm6\" (UniqueName: \"kubernetes.io/projected/9990518c-8209-4d3a-aad6-130834718172-kube-api-access-sjjm6\") pod \"9990518c-8209-4d3a-aad6-130834718172\" (UID: \"9990518c-8209-4d3a-aad6-130834718172\") " Mar 12 12:47:14.550399 master-0 kubenswrapper[13984]: I0312 12:47:14.550148 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9990518c-8209-4d3a-aad6-130834718172-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9990518c-8209-4d3a-aad6-130834718172" (UID: "9990518c-8209-4d3a-aad6-130834718172"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:14.552856 master-0 kubenswrapper[13984]: I0312 12:47:14.551137 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9990518c-8209-4d3a-aad6-130834718172-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:14.560350 master-0 kubenswrapper[13984]: I0312 12:47:14.560289 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9990518c-8209-4d3a-aad6-130834718172-kube-api-access-sjjm6" (OuterVolumeSpecName: "kube-api-access-sjjm6") pod "9990518c-8209-4d3a-aad6-130834718172" (UID: "9990518c-8209-4d3a-aad6-130834718172"). 
InnerVolumeSpecName "kube-api-access-sjjm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:14.654067 master-0 kubenswrapper[13984]: I0312 12:47:14.653958 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjjm6\" (UniqueName: \"kubernetes.io/projected/9990518c-8209-4d3a-aad6-130834718172-kube-api-access-sjjm6\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:15.123105 master-0 kubenswrapper[13984]: I0312 12:47:15.122974 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-n5b5j" Mar 12 12:47:15.133506 master-0 kubenswrapper[13984]: I0312 12:47:15.125591 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-n5b5j" event={"ID":"9990518c-8209-4d3a-aad6-130834718172","Type":"ContainerDied","Data":"c3902f97155b33a7cd83667078ebc8cc64d532c9db50f2d28dddfed33a2cd7da"} Mar 12 12:47:15.133506 master-0 kubenswrapper[13984]: I0312 12:47:15.125653 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3902f97155b33a7cd83667078ebc8cc64d532c9db50f2d28dddfed33a2cd7da" Mar 12 12:47:15.377446 master-0 kubenswrapper[13984]: I0312 12:47:15.377293 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-649858dd65-skgqw" podUID="bf4bf0a7-9ede-4ccf-a7c9-aeeaaef2ac09" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.212:5353: i/o timeout" Mar 12 12:47:15.762946 master-0 kubenswrapper[13984]: I0312 12:47:15.761815 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:15.904560 master-0 kubenswrapper[13984]: I0312 12:47:15.904439 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4wlj\" (UniqueName: \"kubernetes.io/projected/4f28f93e-f049-44fe-b721-ff4ad88db2b0-kube-api-access-j4wlj\") pod \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\" (UID: \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " Mar 12 12:47:15.904778 master-0 kubenswrapper[13984]: I0312 12:47:15.904754 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f28f93e-f049-44fe-b721-ff4ad88db2b0-operator-scripts\") pod \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\" (UID: \"4f28f93e-f049-44fe-b721-ff4ad88db2b0\") " Mar 12 12:47:15.906292 master-0 kubenswrapper[13984]: I0312 12:47:15.906140 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f28f93e-f049-44fe-b721-ff4ad88db2b0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f28f93e-f049-44fe-b721-ff4ad88db2b0" (UID: "4f28f93e-f049-44fe-b721-ff4ad88db2b0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:15.942781 master-0 kubenswrapper[13984]: I0312 12:47:15.942720 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f28f93e-f049-44fe-b721-ff4ad88db2b0-kube-api-access-j4wlj" (OuterVolumeSpecName: "kube-api-access-j4wlj") pod "4f28f93e-f049-44fe-b721-ff4ad88db2b0" (UID: "4f28f93e-f049-44fe-b721-ff4ad88db2b0"). InnerVolumeSpecName "kube-api-access-j4wlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:16.008255 master-0 kubenswrapper[13984]: I0312 12:47:16.008178 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f28f93e-f049-44fe-b721-ff4ad88db2b0-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:16.008255 master-0 kubenswrapper[13984]: I0312 12:47:16.008221 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4wlj\" (UniqueName: \"kubernetes.io/projected/4f28f93e-f049-44fe-b721-ff4ad88db2b0-kube-api-access-j4wlj\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:16.146378 master-0 kubenswrapper[13984]: I0312 12:47:16.144852 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" event={"ID":"4f28f93e-f049-44fe-b721-ff4ad88db2b0","Type":"ContainerDied","Data":"cf361470864c14d7b7b86825471ecf84a4a8d49f1b1eb6cd3d0f42b5ae446c66"} Mar 12 12:47:16.146378 master-0 kubenswrapper[13984]: I0312 12:47:16.144935 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf361470864c14d7b7b86825471ecf84a4a8d49f1b1eb6cd3d0f42b5ae446c66" Mar 12 12:47:16.146378 master-0 kubenswrapper[13984]: I0312 12:47:16.145020 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-7750-account-create-update-mx9br" Mar 12 12:47:16.531366 master-0 kubenswrapper[13984]: I0312 12:47:16.524864 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 12 12:47:16.759964 master-0 kubenswrapper[13984]: I0312 12:47:16.759911 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cbb6f6f66-dz95j" Mar 12 12:47:16.793612 master-0 kubenswrapper[13984]: E0312 12:47:16.785857 13984 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod64252643_1a6d_4283_a6be_bdb4adbac235.slice/crio-2ae84dcc424826dece2a433c245097faf7d15c2363c7bfb4286e140326b977d3.scope\": RecentStats: unable to find data in memory cache]" Mar 12 12:47:16.793612 master-0 kubenswrapper[13984]: I0312 12:47:16.786869 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-scheduler-0"] Mar 12 12:47:16.816733 master-0 kubenswrapper[13984]: W0312 12:47:16.815911 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbffc1e39_45d1_496f_b4af_8083ec87dabf.slice/crio-3b62793ef813888a6054d9a427ee77eff755c479fd327b860ee78630d1c05bd9 WatchSource:0}: Error finding container 3b62793ef813888a6054d9a427ee77eff755c479fd327b860ee78630d1c05bd9: Status 404 returned error can't find the container with id 3b62793ef813888a6054d9a427ee77eff755c479fd327b860ee78630d1c05bd9 Mar 12 12:47:16.861761 master-0 kubenswrapper[13984]: I0312 12:47:16.854505 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-volume-lvm-iscsi-0"] Mar 12 12:47:16.888545 master-0 kubenswrapper[13984]: I0312 12:47:16.888183 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5cbb6f6f66-dz95j" Mar 12 
12:47:17.098075 master-0 kubenswrapper[13984]: I0312 12:47:17.098019 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-5bc96c7dbf-x46lx"] Mar 12 12:47:17.214504 master-0 kubenswrapper[13984]: I0312 12:47:17.211700 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerStarted","Data":"82d8d5e1b1976485d12dcb1ae67aa84a0dd911eca62b6aec5482c23c9d2980e0"} Mar 12 12:47:17.214504 master-0 kubenswrapper[13984]: W0312 12:47:17.212731 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod431f12da_d4a8_4748_87d1_4ffdd0b1ecc4.slice/crio-bab9014fa880ab4cccdda5c16ae732f0f15a3eabf6fbb7171c4fed44538f03e5 WatchSource:0}: Error finding container bab9014fa880ab4cccdda5c16ae732f0f15a3eabf6fbb7171c4fed44538f03e5: Status 404 returned error can't find the container with id bab9014fa880ab4cccdda5c16ae732f0f15a3eabf6fbb7171c4fed44538f03e5 Mar 12 12:47:17.273098 master-0 kubenswrapper[13984]: I0312 12:47:17.272937 13984 generic.go:334] "Generic (PLEG): container finished" podID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerID="eeb54250e63c3840e474dc512537984d482244e8df6768125caa1d48b8cd85a3" exitCode=0 Mar 12 12:47:17.273098 master-0 kubenswrapper[13984]: I0312 12:47:17.273041 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"d1ce46a3-0909-4d15-b166-fa480a7e6164","Type":"ContainerDied","Data":"eeb54250e63c3840e474dc512537984d482244e8df6768125caa1d48b8cd85a3"} Mar 12 12:47:17.283735 master-0 kubenswrapper[13984]: I0312 12:47:17.281756 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-776949857-qhnzl" event={"ID":"4cecbed0-f638-495f-a450-ddcb64f6cc30","Type":"ContainerStarted","Data":"9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f"} Mar 12 12:47:17.283735 master-0 
kubenswrapper[13984]: I0312 12:47:17.283206 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:17.304160 master-0 kubenswrapper[13984]: I0312 12:47:17.304091 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"e688c798-286e-49cf-93f7-9c95e822b759","Type":"ContainerStarted","Data":"851f0ce34862c950a5e4e19958ca15b19fce6cea450b4d03e8ab4f15df195a21"} Mar 12 12:47:17.309861 master-0 kubenswrapper[13984]: I0312 12:47:17.306716 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"bffc1e39-45d1-496f-b4af-8083ec87dabf","Type":"ContainerStarted","Data":"3b62793ef813888a6054d9a427ee77eff755c479fd327b860ee78630d1c05bd9"} Mar 12 12:47:17.309861 master-0 kubenswrapper[13984]: I0312 12:47:17.308382 13984 generic.go:334] "Generic (PLEG): container finished" podID="64252643-1a6d-4283-a6be-bdb4adbac235" containerID="2ae84dcc424826dece2a433c245097faf7d15c2363c7bfb4286e140326b977d3" exitCode=0 Mar 12 12:47:17.310072 master-0 kubenswrapper[13984]: I0312 12:47:17.310022 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerDied","Data":"2ae84dcc424826dece2a433c245097faf7d15c2363c7bfb4286e140326b977d3"} Mar 12 12:47:17.326425 master-0 kubenswrapper[13984]: I0312 12:47:17.321355 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6bdc8876c8-5m8cl"] Mar 12 12:47:17.333290 master-0 kubenswrapper[13984]: E0312 12:47:17.327761 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f28f93e-f049-44fe-b721-ff4ad88db2b0" containerName="mariadb-account-create-update" Mar 12 12:47:17.333290 master-0 kubenswrapper[13984]: I0312 12:47:17.327813 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f28f93e-f049-44fe-b721-ff4ad88db2b0" 
containerName="mariadb-account-create-update" Mar 12 12:47:17.333290 master-0 kubenswrapper[13984]: E0312 12:47:17.327904 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9990518c-8209-4d3a-aad6-130834718172" containerName="mariadb-database-create" Mar 12 12:47:17.333290 master-0 kubenswrapper[13984]: I0312 12:47:17.327912 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="9990518c-8209-4d3a-aad6-130834718172" containerName="mariadb-database-create" Mar 12 12:47:17.333290 master-0 kubenswrapper[13984]: I0312 12:47:17.328277 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="9990518c-8209-4d3a-aad6-130834718172" containerName="mariadb-database-create" Mar 12 12:47:17.333290 master-0 kubenswrapper[13984]: I0312 12:47:17.328312 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f28f93e-f049-44fe-b721-ff4ad88db2b0" containerName="mariadb-account-create-update" Mar 12 12:47:17.333290 master-0 kubenswrapper[13984]: I0312 12:47:17.329561 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.365699 master-0 kubenswrapper[13984]: I0312 12:47:17.365636 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bdc8876c8-5m8cl"] Mar 12 12:47:17.387622 master-0 kubenswrapper[13984]: I0312 12:47:17.387155 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-776949857-qhnzl" podStartSLOduration=4.060233747 podStartE2EDuration="8.387131281s" podCreationTimestamp="2026-03-12 12:47:09 +0000 UTC" firstStartedPulling="2026-03-12 12:47:11.331808046 +0000 UTC m=+1363.529823538" lastFinishedPulling="2026-03-12 12:47:15.65870558 +0000 UTC m=+1367.856721072" observedRunningTime="2026-03-12 12:47:17.331136494 +0000 UTC m=+1369.529151986" watchObservedRunningTime="2026-03-12 12:47:17.387131281 +0000 UTC m=+1369.585146773" Mar 12 12:47:17.392037 master-0 kubenswrapper[13984]: I0312 12:47:17.391986 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:17.484265 master-0 kubenswrapper[13984]: I0312 12:47:17.484210 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-internal-tls-certs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.484438 master-0 kubenswrapper[13984]: I0312 12:47:17.484288 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fafd3eff-630a-4ba3-b901-41d3de58bc54-logs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.484438 master-0 kubenswrapper[13984]: I0312 12:47:17.484307 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-scripts\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.484438 master-0 kubenswrapper[13984]: I0312 12:47:17.484380 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-combined-ca-bundle\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.484579 master-0 kubenswrapper[13984]: I0312 12:47:17.484465 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-config-data\") pod 
\"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.489624 master-0 kubenswrapper[13984]: I0312 12:47:17.487892 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-public-tls-certs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.489624 master-0 kubenswrapper[13984]: I0312 12:47:17.487984 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqfqn\" (UniqueName: \"kubernetes.io/projected/fafd3eff-630a-4ba3-b901-41d3de58bc54-kube-api-access-xqfqn\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.589953 master-0 kubenswrapper[13984]: I0312 12:47:17.589889 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-combined-ca-bundle\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590161 master-0 kubenswrapper[13984]: I0312 12:47:17.590038 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590161 master-0 kubenswrapper[13984]: I0312 12:47:17.590117 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-iscsi\") pod 
\"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590161 master-0 kubenswrapper[13984]: I0312 12:47:17.590148 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-machine-id\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590293 master-0 kubenswrapper[13984]: I0312 12:47:17.590186 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-scripts\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590293 master-0 kubenswrapper[13984]: I0312 12:47:17.590212 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-brick\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590293 master-0 kubenswrapper[13984]: I0312 12:47:17.590248 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-dev\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590452 master-0 kubenswrapper[13984]: I0312 12:47:17.590309 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-sys\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590452 master-0 kubenswrapper[13984]: I0312 12:47:17.590326 13984 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-lib-cinder\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590452 master-0 kubenswrapper[13984]: I0312 12:47:17.590365 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-nvme\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590452 master-0 kubenswrapper[13984]: I0312 12:47:17.590382 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-cinder\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590452 master-0 kubenswrapper[13984]: I0312 12:47:17.590403 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data-custom\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.590452 master-0 kubenswrapper[13984]: I0312 12:47:17.590434 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srvfm\" (UniqueName: \"kubernetes.io/projected/d1ce46a3-0909-4d15-b166-fa480a7e6164-kube-api-access-srvfm\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.591539 master-0 kubenswrapper[13984]: I0312 12:47:17.591508 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-dev" (OuterVolumeSpecName: "dev") pod 
"d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.591794 master-0 kubenswrapper[13984]: I0312 12:47:17.591738 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592021 master-0 kubenswrapper[13984]: I0312 12:47:17.591822 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592021 master-0 kubenswrapper[13984]: I0312 12:47:17.591861 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592021 master-0 kubenswrapper[13984]: I0312 12:47:17.591891 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-sys" (OuterVolumeSpecName: "sys") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592021 master-0 kubenswrapper[13984]: I0312 12:47:17.591916 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592021 master-0 kubenswrapper[13984]: I0312 12:47:17.591944 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592279 master-0 kubenswrapper[13984]: I0312 12:47:17.592176 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592351 master-0 kubenswrapper[13984]: I0312 12:47:17.592322 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-run" (OuterVolumeSpecName: "run") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.592439 master-0 kubenswrapper[13984]: I0312 12:47:17.592354 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-run\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.592439 master-0 kubenswrapper[13984]: I0312 12:47:17.592401 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-lib-modules\") pod \"d1ce46a3-0909-4d15-b166-fa480a7e6164\" (UID: \"d1ce46a3-0909-4d15-b166-fa480a7e6164\") " Mar 12 12:47:17.592910 master-0 kubenswrapper[13984]: I0312 12:47:17.592883 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-public-tls-certs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.593103 master-0 kubenswrapper[13984]: I0312 12:47:17.592930 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqfqn\" (UniqueName: \"kubernetes.io/projected/fafd3eff-630a-4ba3-b901-41d3de58bc54-kube-api-access-xqfqn\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.593161 master-0 kubenswrapper[13984]: I0312 12:47:17.593117 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-internal-tls-certs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 
12:47:17.593212 master-0 kubenswrapper[13984]: I0312 12:47:17.593185 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fafd3eff-630a-4ba3-b901-41d3de58bc54-logs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.593256 master-0 kubenswrapper[13984]: I0312 12:47:17.593220 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-scripts\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.593400 master-0 kubenswrapper[13984]: I0312 12:47:17.593372 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-combined-ca-bundle\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.593589 master-0 kubenswrapper[13984]: I0312 12:47:17.593562 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-config-data\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.593853 master-0 kubenswrapper[13984]: I0312 12:47:17.593811 13984 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593853 master-0 kubenswrapper[13984]: I0312 12:47:17.593842 13984 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593953 master-0 kubenswrapper[13984]: I0312 12:47:17.593858 13984 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593953 master-0 kubenswrapper[13984]: I0312 12:47:17.593874 13984 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-dev\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593953 master-0 kubenswrapper[13984]: I0312 12:47:17.593883 13984 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-sys\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593953 master-0 kubenswrapper[13984]: I0312 12:47:17.593892 13984 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593953 master-0 kubenswrapper[13984]: I0312 12:47:17.593900 13984 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593953 master-0 kubenswrapper[13984]: I0312 12:47:17.593909 13984 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.593953 master-0 kubenswrapper[13984]: I0312 12:47:17.593918 13984 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-run\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:17.598834 master-0 kubenswrapper[13984]: I0312 12:47:17.598799 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 12 12:47:17.600499 master-0 kubenswrapper[13984]: I0312 12:47:17.599944 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:17.600499 master-0 kubenswrapper[13984]: I0312 12:47:17.600292 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-scripts" (OuterVolumeSpecName: "scripts") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:17.600734 master-0 kubenswrapper[13984]: I0312 12:47:17.600711 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-scripts\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.601123 master-0 kubenswrapper[13984]: I0312 12:47:17.601095 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fafd3eff-630a-4ba3-b901-41d3de58bc54-logs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.610670 master-0 kubenswrapper[13984]: I0312 12:47:17.607118 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-config-data\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:17.610850 master-0 kubenswrapper[13984]: I0312 12:47:17.610765 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1ce46a3-0909-4d15-b166-fa480a7e6164-kube-api-access-srvfm" (OuterVolumeSpecName: "kube-api-access-srvfm") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "kube-api-access-srvfm". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:47:17.621408 master-0 kubenswrapper[13984]: I0312 12:47:17.621353 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-internal-tls-certs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl"
Mar 12 12:47:17.625253 master-0 kubenswrapper[13984]: I0312 12:47:17.625203 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-combined-ca-bundle\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl"
Mar 12 12:47:17.628630 master-0 kubenswrapper[13984]: I0312 12:47:17.628545 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fafd3eff-630a-4ba3-b901-41d3de58bc54-public-tls-certs\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl"
Mar 12 12:47:17.629354 master-0 kubenswrapper[13984]: I0312 12:47:17.629305 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqfqn\" (UniqueName: \"kubernetes.io/projected/fafd3eff-630a-4ba3-b901-41d3de58bc54-kube-api-access-xqfqn\") pod \"placement-6bdc8876c8-5m8cl\" (UID: \"fafd3eff-630a-4ba3-b901-41d3de58bc54\") " pod="openstack/placement-6bdc8876c8-5m8cl"
Mar 12 12:47:17.691560 master-0 kubenswrapper[13984]: I0312 12:47:17.691458 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6bdc8876c8-5m8cl"
Mar 12 12:47:17.696471 master-0 kubenswrapper[13984]: I0312 12:47:17.696028 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:17.696471 master-0 kubenswrapper[13984]: I0312 12:47:17.696060 13984 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:17.696471 master-0 kubenswrapper[13984]: I0312 12:47:17.696072 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srvfm\" (UniqueName: \"kubernetes.io/projected/d1ce46a3-0909-4d15-b166-fa480a7e6164-kube-api-access-srvfm\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:17.696471 master-0 kubenswrapper[13984]: I0312 12:47:17.696080 13984 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d1ce46a3-0909-4d15-b166-fa480a7e6164-lib-modules\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:17.876920 master-0 kubenswrapper[13984]: I0312 12:47:17.876851 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:17.902316 master-0 kubenswrapper[13984]: I0312 12:47:17.902156 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:17.991190 master-0 kubenswrapper[13984]: I0312 12:47:17.990666 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data" (OuterVolumeSpecName: "config-data") pod "d1ce46a3-0909-4d15-b166-fa480a7e6164" (UID: "d1ce46a3-0909-4d15-b166-fa480a7e6164"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:18.005095 master-0 kubenswrapper[13984]: I0312 12:47:18.005037 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d1ce46a3-0909-4d15-b166-fa480a7e6164-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:18.327056 master-0 kubenswrapper[13984]: I0312 12:47:18.326988 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5bc96c7dbf-x46lx" event={"ID":"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4","Type":"ContainerStarted","Data":"bab9014fa880ab4cccdda5c16ae732f0f15a3eabf6fbb7171c4fed44538f03e5"}
Mar 12 12:47:18.343732 master-0 kubenswrapper[13984]: I0312 12:47:18.328354 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerStarted","Data":"c25b182876df3f68e947e443c8063f79f37dab5a0a8f3a8223e02e7a948a706e"}
Mar 12 12:47:18.343732 master-0 kubenswrapper[13984]: I0312 12:47:18.331802 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.343732 master-0 kubenswrapper[13984]: I0312 12:47:18.332858 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"d1ce46a3-0909-4d15-b166-fa480a7e6164","Type":"ContainerDied","Data":"598407bbaa32270d50c3f82a3df29944ef2f3e45c11a3a169a35914270f9f2cf"}
Mar 12 12:47:18.343732 master-0 kubenswrapper[13984]: I0312 12:47:18.332909 13984 scope.go:117] "RemoveContainer" containerID="e90f7d48875325837b809728e5f30a92e9a080085dac7bd5acce43c5d815fc5e"
Mar 12 12:47:18.343732 master-0 kubenswrapper[13984]: I0312 12:47:18.341894 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"e688c798-286e-49cf-93f7-9c95e822b759","Type":"ContainerStarted","Data":"23e1c84948fb50d1e8cf8ddc916d5c28a061769432e02980245f26fa99fa52c8"}
Mar 12 12:47:18.401533 master-0 kubenswrapper[13984]: I0312 12:47:18.400262 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6bdc8876c8-5m8cl"]
Mar 12 12:47:18.412686 master-0 kubenswrapper[13984]: I0312 12:47:18.410099 13984 scope.go:117] "RemoveContainer" containerID="eeb54250e63c3840e474dc512537984d482244e8df6768125caa1d48b8cd85a3"
Mar 12 12:47:18.473296 master-0 kubenswrapper[13984]: I0312 12:47:18.470793 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-8c9c7-backup-0"]
Mar 12 12:47:18.495538 master-0 kubenswrapper[13984]: I0312 12:47:18.494301 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-8c9c7-backup-0"]
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: I0312 12:47:18.505376 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-8c9c7-backup-0"]
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: E0312 12:47:18.506760 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerName="probe"
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: I0312 12:47:18.506792 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerName="probe"
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: E0312 12:47:18.506844 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerName="cinder-backup"
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: I0312 12:47:18.506852 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerName="cinder-backup"
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: I0312 12:47:18.507161 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerName="cinder-backup"
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: I0312 12:47:18.507190 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" containerName="probe"
Mar 12 12:47:18.509516 master-0 kubenswrapper[13984]: I0312 12:47:18.509228 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.520112 master-0 kubenswrapper[13984]: I0312 12:47:18.512237 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-8c9c7-backup-config-data"
Mar 12 12:47:18.520112 master-0 kubenswrapper[13984]: I0312 12:47:18.515117 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-backup-0"]
Mar 12 12:47:18.620807 master-0 kubenswrapper[13984]: I0312 12:47:18.620773 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-iscsi\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.620935 master-0 kubenswrapper[13984]: I0312 12:47:18.620821 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-run\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.620935 master-0 kubenswrapper[13984]: I0312 12:47:18.620854 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-machine-id\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.620935 master-0 kubenswrapper[13984]: I0312 12:47:18.620875 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-scripts\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.620935 master-0 kubenswrapper[13984]: I0312 12:47:18.620898 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-config-data-custom\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621096 master-0 kubenswrapper[13984]: I0312 12:47:18.620945 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-lib-modules\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621096 master-0 kubenswrapper[13984]: I0312 12:47:18.620963 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-config-data\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621096 master-0 kubenswrapper[13984]: I0312 12:47:18.620981 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-nvme\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621096 master-0 kubenswrapper[13984]: I0312 12:47:18.621010 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-lib-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621096 master-0 kubenswrapper[13984]: I0312 12:47:18.621060 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm5fk\" (UniqueName: \"kubernetes.io/projected/4648a719-c73b-41e4-9409-ffc011468cf1-kube-api-access-hm5fk\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621096 master-0 kubenswrapper[13984]: I0312 12:47:18.621080 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-locks-brick\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621289 master-0 kubenswrapper[13984]: I0312 12:47:18.621105 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-locks-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621289 master-0 kubenswrapper[13984]: I0312 12:47:18.621127 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-dev\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621289 master-0 kubenswrapper[13984]: I0312 12:47:18.621162 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-sys\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.621289 master-0 kubenswrapper[13984]: I0312 12:47:18.621176 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-combined-ca-bundle\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723198 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-dev\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723309 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-combined-ca-bundle\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723338 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-sys\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723361 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-iscsi\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723394 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-run\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723439 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-machine-id\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723466 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-scripts\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723522 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-config-data-custom\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723585 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-lib-modules\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723602 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-config-data\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723620 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-nvme\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723663 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-lib-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723732 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm5fk\" (UniqueName: \"kubernetes.io/projected/4648a719-c73b-41e4-9409-ffc011468cf1-kube-api-access-hm5fk\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723757 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-locks-brick\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.723868 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-locks-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.724023 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-locks-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.724071 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-dev\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.727558 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-machine-id\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.727705 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-lib-cinder\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.727743 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-sys\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.727769 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-iscsi\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.727804 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-etc-nvme\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.730169 master-0 kubenswrapper[13984]: I0312 12:47:18.729513 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-lib-modules\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.734404 master-0 kubenswrapper[13984]: I0312 12:47:18.732093 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-combined-ca-bundle\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.734404 master-0 kubenswrapper[13984]: I0312 12:47:18.732663 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-run\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.734404 master-0 kubenswrapper[13984]: I0312 12:47:18.732746 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4648a719-c73b-41e4-9409-ffc011468cf1-var-locks-brick\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.743778 master-0 kubenswrapper[13984]: I0312 12:47:18.742110 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-config-data\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.751139 master-0 kubenswrapper[13984]: I0312 12:47:18.751063 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-config-data-custom\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.761407 master-0 kubenswrapper[13984]: I0312 12:47:18.760060 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4648a719-c73b-41e4-9409-ffc011468cf1-scripts\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:18.766579 master-0 kubenswrapper[13984]: I0312 12:47:18.765124 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm5fk\" (UniqueName: \"kubernetes.io/projected/4648a719-c73b-41e4-9409-ffc011468cf1-kube-api-access-hm5fk\") pod \"cinder-8c9c7-backup-0\" (UID: \"4648a719-c73b-41e4-9409-ffc011468cf1\") " pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:19.075173 master-0 kubenswrapper[13984]: I0312 12:47:19.073647 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-8c9c7-backup-0"
Mar 12 12:47:19.262264 master-0 kubenswrapper[13984]: I0312 12:47:19.262148 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7d857f7b77-g6ptr"
Mar 12 12:47:19.443541 master-0 kubenswrapper[13984]: I0312 12:47:19.440516 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerStarted","Data":"669d7d3772fdabff2959160746730dbe47dda31d253ae2d80e2c3f2584a6c463"}
Mar 12 12:47:19.443541 master-0 kubenswrapper[13984]: I0312 12:47:19.440573 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerStarted","Data":"4f36b1500967347631bfc950041c38dee15068bcaed2efe17938b6bc7aa4d799"}
Mar 12 12:47:19.443541 master-0 kubenswrapper[13984]: I0312 12:47:19.442010 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-59d99f4857-zbcnf"
Mar 12 12:47:19.453692 master-0 kubenswrapper[13984]: I0312 12:47:19.453597 13984 generic.go:334] "Generic (PLEG): container finished" podID="431f12da-d4a8-4748-87d1-4ffdd0b1ecc4" containerID="10949bd2769ef45f395935e8c3df0ce919a357645b3d069b7daaa2cae468eff0" exitCode=0
Mar 12 12:47:19.453692 master-0 kubenswrapper[13984]: I0312 12:47:19.453670 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5bc96c7dbf-x46lx" event={"ID":"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4","Type":"ContainerDied","Data":"10949bd2769ef45f395935e8c3df0ce919a357645b3d069b7daaa2cae468eff0"}
Mar 12 12:47:19.575567 master-0 kubenswrapper[13984]: I0312 12:47:19.561874 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-59d99f4857-zbcnf" podStartSLOduration=6.228544122 podStartE2EDuration="10.561849557s" podCreationTimestamp="2026-03-12 12:47:09 +0000 UTC" firstStartedPulling="2026-03-12 12:47:11.3323045 +0000 UTC m=+1363.530319992" lastFinishedPulling="2026-03-12 12:47:15.665609935 +0000 UTC m=+1367.863625427" observedRunningTime="2026-03-12 12:47:19.471166592 +0000 UTC m=+1371.669182084" watchObservedRunningTime="2026-03-12 12:47:19.561849557 +0000 UTC m=+1371.759865049"
Mar 12 12:47:19.598447 master-0 kubenswrapper[13984]: I0312 12:47:19.597928 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" event={"ID":"e688c798-286e-49cf-93f7-9c95e822b759","Type":"ContainerStarted","Data":"b4d3fb9d655aaa3a0707ec4afae4dce3402572dbd375194ca75b5fbfde8c3652"}
Mar 12 12:47:19.618246 master-0 kubenswrapper[13984]: I0312 12:47:19.612417 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdc8876c8-5m8cl" event={"ID":"fafd3eff-630a-4ba3-b901-41d3de58bc54","Type":"ContainerStarted","Data":"03085848c59f60efd641348810132a7b58cccdce13b4ac23482c2a9cada49cc3"}
Mar 12 12:47:19.618246 master-0 kubenswrapper[13984]: I0312 12:47:19.612519 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdc8876c8-5m8cl" event={"ID":"fafd3eff-630a-4ba3-b901-41d3de58bc54","Type":"ContainerStarted","Data":"e60d7ed5def50dbe3f35efe59d5a856a14f481be2a25c8471d7ac69d49ca95ad"}
Mar 12 12:47:19.618246 master-0 kubenswrapper[13984]: I0312 12:47:19.617735 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"bffc1e39-45d1-496f-b4af-8083ec87dabf","Type":"ContainerStarted","Data":"36e594f1311cf2b8dcf8bab5a0cd7dbf79d0b6a1969336cb196df780d6467976"}
Mar 12 12:47:19.704509 master-0 kubenswrapper[13984]: I0312 12:47:19.701028 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" podStartSLOduration=6.701009557 podStartE2EDuration="6.701009557s" podCreationTimestamp="2026-03-12 12:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:19.673030969 +0000 UTC m=+1371.871046481" watchObservedRunningTime="2026-03-12 12:47:19.701009557 +0000 UTC m=+1371.899025049"
Mar 12 12:47:19.752394 master-0 kubenswrapper[13984]: I0312 12:47:19.749976 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-776949857-qhnzl"
Mar 12 12:47:19.951032 master-0 kubenswrapper[13984]: I0312 12:47:19.950962 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6"
Mar 12 12:47:19.960809 master-0 kubenswrapper[13984]: I0312 12:47:19.960751 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-8c9c7-backup-0"]
Mar 12 12:47:20.050460 master-0 kubenswrapper[13984]: I0312 12:47:20.043590 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1ce46a3-0909-4d15-b166-fa480a7e6164" path="/var/lib/kubelet/pods/d1ce46a3-0909-4d15-b166-fa480a7e6164/volumes"
Mar 12 12:47:20.050460 master-0 kubenswrapper[13984]: I0312 12:47:20.048092 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-678f7c7469-hf7k6"]
Mar 12 12:47:20.050460 master-0 kubenswrapper[13984]: I0312 12:47:20.048382 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" podUID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerName="dnsmasq-dns" containerID="cri-o://1f8686b1922e0425ed1102482e059e392e43f7ba55fed20380712f74fceb3a36" gracePeriod=10
Mar 12 12:47:20.090089 master-0 kubenswrapper[13984]: I0312 12:47:20.089594 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 12 12:47:20.096419 master-0 kubenswrapper[13984]: I0312 12:47:20.095804 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 12 12:47:20.109442 master-0 kubenswrapper[13984]: I0312 12:47:20.109396 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 12 12:47:20.109876 master-0 kubenswrapper[13984]: I0312 12:47:20.109396 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 12 12:47:20.132929 master-0 kubenswrapper[13984]: I0312 12:47:20.132341 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 12 12:47:20.176921 master-0 kubenswrapper[13984]: I0312 12:47:20.165074 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzggg\" (UniqueName: \"kubernetes.io/projected/1f0d75f9-805b-4886-9bff-8544337a2fd1-kube-api-access-tzggg\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.176921 master-0 kubenswrapper[13984]: I0312 12:47:20.165119 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0d75f9-805b-4886-9bff-8544337a2fd1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.176921 master-0 kubenswrapper[13984]: I0312 12:47:20.165145 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1f0d75f9-805b-4886-9bff-8544337a2fd1-openstack-config-secret\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.176921 master-0 kubenswrapper[13984]: I0312 12:47:20.165350 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1f0d75f9-805b-4886-9bff-8544337a2fd1-openstack-config\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.277929 master-0 kubenswrapper[13984]: I0312 12:47:20.277874 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1f0d75f9-805b-4886-9bff-8544337a2fd1-openstack-config\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.278148 master-0 kubenswrapper[13984]: I0312 12:47:20.277975 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzggg\" (UniqueName: \"kubernetes.io/projected/1f0d75f9-805b-4886-9bff-8544337a2fd1-kube-api-access-tzggg\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.278148 master-0 kubenswrapper[13984]: I0312 12:47:20.278002 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0d75f9-805b-4886-9bff-8544337a2fd1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.278148 master-0 kubenswrapper[13984]: I0312 12:47:20.278031 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1f0d75f9-805b-4886-9bff-8544337a2fd1-openstack-config-secret\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.280272 master-0 kubenswrapper[13984]: I0312 12:47:20.279375 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1f0d75f9-805b-4886-9bff-8544337a2fd1-openstack-config\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.291032 master-0 kubenswrapper[13984]: I0312 12:47:20.290954 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1f0d75f9-805b-4886-9bff-8544337a2fd1-openstack-config-secret\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.310221 master-0 kubenswrapper[13984]: I0312 12:47:20.310151 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-8c9c7-api-0"
Mar 12 12:47:20.315698 master-0 kubenswrapper[13984]: I0312 12:47:20.314910 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f0d75f9-805b-4886-9bff-8544337a2fd1-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.315806 master-0 kubenswrapper[13984]: I0312 12:47:20.315779 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzggg\" (UniqueName: \"kubernetes.io/projected/1f0d75f9-805b-4886-9bff-8544337a2fd1-kube-api-access-tzggg\") pod \"openstackclient\" (UID: \"1f0d75f9-805b-4886-9bff-8544337a2fd1\") " pod="openstack/openstackclient"
Mar 12 12:47:20.458209 master-0 kubenswrapper[13984]: I0312 12:47:20.458130 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 12 12:47:20.673142 master-0 kubenswrapper[13984]: I0312 12:47:20.673050 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"4648a719-c73b-41e4-9409-ffc011468cf1","Type":"ContainerStarted","Data":"5972f9dee460d6c8fd263fc78b938b53da46d8b2f1c47dd6e0da441b6db6888f"}
Mar 12 12:47:20.691844 master-0 kubenswrapper[13984]: E0312 12:47:20.691750 13984 log.go:32] "ExecSync cmd from runtime service failed" err=<
Mar 12 12:47:20.691844 master-0 kubenswrapper[13984]: rpc error: code = Unknown desc = command error: setns `mnt`: Bad file descriptor
Mar 12 12:47:20.691844 master-0 kubenswrapper[13984]: fail startup
Mar 12 12:47:20.691844 master-0 kubenswrapper[13984]: , stdout: , stderr: , exit code -1
Mar 12 12:47:20.691844 master-0 kubenswrapper[13984]: > containerID="9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f" cmd=["/bin/true"]
Mar 12 12:47:20.710416 master-0 kubenswrapper[13984]: E0312 12:47:20.701067 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f is running failed: container process not found" containerID="9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f" cmd=["/bin/true"]
Mar 12 12:47:20.710416 master-0 kubenswrapper[13984]: I0312 12:47:20.707756 13984 generic.go:334] "Generic (PLEG): container finished" podID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerID="1f8686b1922e0425ed1102482e059e392e43f7ba55fed20380712f74fceb3a36" exitCode=0
Mar 12 12:47:20.710416 master-0 kubenswrapper[13984]: I0312 12:47:20.707814 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6"
event={"ID":"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118","Type":"ContainerDied","Data":"1f8686b1922e0425ed1102482e059e392e43f7ba55fed20380712f74fceb3a36"} Mar 12 12:47:20.710416 master-0 kubenswrapper[13984]: E0312 12:47:20.707853 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f is running failed: container process not found" containerID="9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f" cmd=["/bin/true"] Mar 12 12:47:20.710416 master-0 kubenswrapper[13984]: E0312 12:47:20.707877 13984 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f is running failed: container process not found" probeType="Liveness" pod="openstack/ironic-neutron-agent-776949857-qhnzl" podUID="4cecbed0-f638-495f-a450-ddcb64f6cc30" containerName="ironic-neutron-agent" Mar 12 12:47:20.741207 master-0 kubenswrapper[13984]: I0312 12:47:20.732491 13984 generic.go:334] "Generic (PLEG): container finished" podID="64252643-1a6d-4283-a6be-bdb4adbac235" containerID="669d7d3772fdabff2959160746730dbe47dda31d253ae2d80e2c3f2584a6c463" exitCode=1 Mar 12 12:47:20.741207 master-0 kubenswrapper[13984]: I0312 12:47:20.734601 13984 scope.go:117] "RemoveContainer" containerID="669d7d3772fdabff2959160746730dbe47dda31d253ae2d80e2c3f2584a6c463" Mar 12 12:47:20.741207 master-0 kubenswrapper[13984]: I0312 12:47:20.735063 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerDied","Data":"669d7d3772fdabff2959160746730dbe47dda31d253ae2d80e2c3f2584a6c463"} Mar 12 12:47:21.240444 master-0 kubenswrapper[13984]: I0312 12:47:21.240375 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstackclient"] Mar 12 12:47:21.302526 master-0 kubenswrapper[13984]: W0312 12:47:21.299337 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f0d75f9_805b_4886_9bff_8544337a2fd1.slice/crio-3423ea327c7bc6b93f3e71e3a126550b622cc544d0f9415fff1ceaf5da478dbc WatchSource:0}: Error finding container 3423ea327c7bc6b93f3e71e3a126550b622cc544d0f9415fff1ceaf5da478dbc: Status 404 returned error can't find the container with id 3423ea327c7bc6b93f3e71e3a126550b622cc544d0f9415fff1ceaf5da478dbc Mar 12 12:47:21.671540 master-0 kubenswrapper[13984]: I0312 12:47:21.667122 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:47:21.772505 master-0 kubenswrapper[13984]: I0312 12:47:21.761267 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-config\") pod \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " Mar 12 12:47:21.772505 master-0 kubenswrapper[13984]: I0312 12:47:21.761374 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-sb\") pod \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " Mar 12 12:47:21.772505 master-0 kubenswrapper[13984]: I0312 12:47:21.761402 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-swift-storage-0\") pod \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " Mar 12 12:47:21.772505 master-0 kubenswrapper[13984]: I0312 12:47:21.761532 13984 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-49hq4\" (UniqueName: \"kubernetes.io/projected/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-kube-api-access-49hq4\") pod \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " Mar 12 12:47:21.772505 master-0 kubenswrapper[13984]: I0312 12:47:21.761651 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-nb\") pod \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " Mar 12 12:47:21.772505 master-0 kubenswrapper[13984]: I0312 12:47:21.761815 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-svc\") pod \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\" (UID: \"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118\") " Mar 12 12:47:21.803124 master-0 kubenswrapper[13984]: I0312 12:47:21.803042 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-kube-api-access-49hq4" (OuterVolumeSpecName: "kube-api-access-49hq4") pod "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" (UID: "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118"). InnerVolumeSpecName "kube-api-access-49hq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:21.816559 master-0 kubenswrapper[13984]: I0312 12:47:21.815983 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6bdc8876c8-5m8cl" event={"ID":"fafd3eff-630a-4ba3-b901-41d3de58bc54","Type":"ContainerStarted","Data":"c9540686f1b75d32e17a197a978383e4efc6fccff0789afdb939f1a80fe628f4"} Mar 12 12:47:21.816559 master-0 kubenswrapper[13984]: I0312 12:47:21.816083 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:21.816559 master-0 kubenswrapper[13984]: I0312 12:47:21.816108 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:21.832818 master-0 kubenswrapper[13984]: I0312 12:47:21.831861 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-scheduler-0" event={"ID":"bffc1e39-45d1-496f-b4af-8083ec87dabf","Type":"ContainerStarted","Data":"a051297344fb87bf55bd3c3eddd2a4273e0b5f2eb9947d0cd8a41fbee490a146"} Mar 12 12:47:21.834461 master-0 kubenswrapper[13984]: I0312 12:47:21.834417 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5bc96c7dbf-x46lx" event={"ID":"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4","Type":"ContainerStarted","Data":"e0fa54b6f5b2dbfa3c3904f5a178e4f47dabc75c7e6731025d3142b68787598a"} Mar 12 12:47:21.852577 master-0 kubenswrapper[13984]: I0312 12:47:21.848652 13984 generic.go:334] "Generic (PLEG): container finished" podID="43e6fb75-b813-4074-8029-d6817b1bb9e2" containerID="c25b182876df3f68e947e443c8063f79f37dab5a0a8f3a8223e02e7a948a706e" exitCode=0 Mar 12 12:47:21.852577 master-0 kubenswrapper[13984]: I0312 12:47:21.848741 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerDied","Data":"c25b182876df3f68e947e443c8063f79f37dab5a0a8f3a8223e02e7a948a706e"} Mar 
12 12:47:21.860164 master-0 kubenswrapper[13984]: I0312 12:47:21.858882 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6bdc8876c8-5m8cl" podStartSLOduration=4.858853486 podStartE2EDuration="4.858853486s" podCreationTimestamp="2026-03-12 12:47:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:21.84798509 +0000 UTC m=+1374.046000582" watchObservedRunningTime="2026-03-12 12:47:21.858853486 +0000 UTC m=+1374.056868978" Mar 12 12:47:21.865619 master-0 kubenswrapper[13984]: I0312 12:47:21.865100 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49hq4\" (UniqueName: \"kubernetes.io/projected/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-kube-api-access-49hq4\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:21.874589 master-0 kubenswrapper[13984]: I0312 12:47:21.874534 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" event={"ID":"99a4ff2d-fa32-4c3c-ae39-8a6f745c8118","Type":"ContainerDied","Data":"24dc8393d60a4f4241937f837b4789dd0eb63473193be9f70275159ebbce7e86"} Mar 12 12:47:21.875974 master-0 kubenswrapper[13984]: I0312 12:47:21.874976 13984 scope.go:117] "RemoveContainer" containerID="1f8686b1922e0425ed1102482e059e392e43f7ba55fed20380712f74fceb3a36" Mar 12 12:47:21.875974 master-0 kubenswrapper[13984]: I0312 12:47:21.874938 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-678f7c7469-hf7k6" Mar 12 12:47:21.879571 master-0 kubenswrapper[13984]: I0312 12:47:21.879495 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-scheduler-0" podStartSLOduration=8.879462336 podStartE2EDuration="8.879462336s" podCreationTimestamp="2026-03-12 12:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:21.867894571 +0000 UTC m=+1374.065910063" watchObservedRunningTime="2026-03-12 12:47:21.879462336 +0000 UTC m=+1374.077477828" Mar 12 12:47:21.884201 master-0 kubenswrapper[13984]: I0312 12:47:21.884159 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1f0d75f9-805b-4886-9bff-8544337a2fd1","Type":"ContainerStarted","Data":"3423ea327c7bc6b93f3e71e3a126550b622cc544d0f9415fff1ceaf5da478dbc"} Mar 12 12:47:21.885742 master-0 kubenswrapper[13984]: I0312 12:47:21.885704 13984 generic.go:334] "Generic (PLEG): container finished" podID="4cecbed0-f638-495f-a450-ddcb64f6cc30" containerID="9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f" exitCode=1 Mar 12 12:47:21.885742 master-0 kubenswrapper[13984]: I0312 12:47:21.885739 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-776949857-qhnzl" event={"ID":"4cecbed0-f638-495f-a450-ddcb64f6cc30","Type":"ContainerDied","Data":"9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f"} Mar 12 12:47:21.886387 master-0 kubenswrapper[13984]: I0312 12:47:21.886332 13984 scope.go:117] "RemoveContainer" containerID="9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f" Mar 12 12:47:22.258294 master-0 kubenswrapper[13984]: I0312 12:47:22.258244 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-swift-storage-0" 
(OuterVolumeSpecName: "dns-swift-storage-0") pod "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" (UID: "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:22.295557 master-0 kubenswrapper[13984]: I0312 12:47:22.290086 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" (UID: "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:22.295557 master-0 kubenswrapper[13984]: I0312 12:47:22.295531 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:22.295828 master-0 kubenswrapper[13984]: I0312 12:47:22.295588 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:22.336406 master-0 kubenswrapper[13984]: I0312 12:47:22.331213 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" (UID: "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:22.354130 master-0 kubenswrapper[13984]: I0312 12:47:22.351975 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-config" (OuterVolumeSpecName: "config") pod "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" (UID: "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:22.360953 master-0 kubenswrapper[13984]: I0312 12:47:22.358217 13984 scope.go:117] "RemoveContainer" containerID="9cf24830755bb46d1234bf2d28fb38969322c5ce5bebc5d9c66bcf6f05b46a40" Mar 12 12:47:22.379362 master-0 kubenswrapper[13984]: I0312 12:47:22.376576 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" (UID: "99a4ff2d-fa32-4c3c-ae39-8a6f745c8118"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:22.401716 master-0 kubenswrapper[13984]: I0312 12:47:22.398010 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:22.401716 master-0 kubenswrapper[13984]: I0312 12:47:22.398056 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:22.401716 master-0 kubenswrapper[13984]: I0312 12:47:22.398067 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:22.704399 master-0 kubenswrapper[13984]: I0312 12:47:22.704345 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-678f7c7469-hf7k6"] Mar 12 12:47:22.726574 master-0 kubenswrapper[13984]: I0312 12:47:22.726535 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-678f7c7469-hf7k6"] Mar 12 12:47:22.926560 master-0 kubenswrapper[13984]: I0312 12:47:22.924928 13984 generic.go:334] "Generic (PLEG): container finished" podID="64252643-1a6d-4283-a6be-bdb4adbac235" containerID="91f4677cb9cc34c2283212e4c52245d1c898f39124b170d19dc2b0f5a76f53ae" exitCode=1 Mar 12 12:47:22.926560 master-0 kubenswrapper[13984]: I0312 12:47:22.925008 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerDied","Data":"91f4677cb9cc34c2283212e4c52245d1c898f39124b170d19dc2b0f5a76f53ae"} Mar 12 12:47:22.926560 master-0 kubenswrapper[13984]: I0312 12:47:22.925043 13984 scope.go:117] "RemoveContainer" 
containerID="669d7d3772fdabff2959160746730dbe47dda31d253ae2d80e2c3f2584a6c463" Mar 12 12:47:22.926560 master-0 kubenswrapper[13984]: I0312 12:47:22.925923 13984 scope.go:117] "RemoveContainer" containerID="91f4677cb9cc34c2283212e4c52245d1c898f39124b170d19dc2b0f5a76f53ae" Mar 12 12:47:22.926560 master-0 kubenswrapper[13984]: E0312 12:47:22.926203 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-59d99f4857-zbcnf_openstack(64252643-1a6d-4283-a6be-bdb4adbac235)\"" pod="openstack/ironic-59d99f4857-zbcnf" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" Mar 12 12:47:22.939832 master-0 kubenswrapper[13984]: I0312 12:47:22.939787 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-5bc96c7dbf-x46lx" event={"ID":"431f12da-d4a8-4748-87d1-4ffdd0b1ecc4","Type":"ContainerStarted","Data":"852ff73c0b9acbe932ad40000549bc5317025d2efb93d21c9b8e505a7eee0e83"} Mar 12 12:47:22.942027 master-0 kubenswrapper[13984]: I0312 12:47:22.941968 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:22.953449 master-0 kubenswrapper[13984]: I0312 12:47:22.953393 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"4648a719-c73b-41e4-9409-ffc011468cf1","Type":"ContainerStarted","Data":"5253cd36895742d4b877b94b16730b3a234a407e92bc0a6904638ca856ca4974"} Mar 12 12:47:22.953904 master-0 kubenswrapper[13984]: I0312 12:47:22.953887 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-8c9c7-backup-0" event={"ID":"4648a719-c73b-41e4-9409-ffc011468cf1","Type":"ContainerStarted","Data":"551bf07e5b1fc1d7f7f4f0932ef80d7a42df01ab76a8f10b6a1cbd93ef3065cb"} Mar 12 12:47:22.958353 master-0 kubenswrapper[13984]: I0312 12:47:22.958327 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-neutron-agent-776949857-qhnzl" event={"ID":"4cecbed0-f638-495f-a450-ddcb64f6cc30","Type":"ContainerStarted","Data":"f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8"} Mar 12 12:47:22.959069 master-0 kubenswrapper[13984]: I0312 12:47:22.959054 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:23.016549 master-0 kubenswrapper[13984]: I0312 12:47:23.013264 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-8c9c7-backup-0" podStartSLOduration=5.013246438 podStartE2EDuration="5.013246438s" podCreationTimestamp="2026-03-12 12:47:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:23.009713038 +0000 UTC m=+1375.207728530" watchObservedRunningTime="2026-03-12 12:47:23.013246438 +0000 UTC m=+1375.211261930" Mar 12 12:47:23.092493 master-0 kubenswrapper[13984]: I0312 12:47:23.088909 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-5bc96c7dbf-x46lx" podStartSLOduration=10.088886219 podStartE2EDuration="10.088886219s" podCreationTimestamp="2026-03-12 12:47:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:23.035932927 +0000 UTC m=+1375.233948419" watchObservedRunningTime="2026-03-12 12:47:23.088886219 +0000 UTC m=+1375.286901711" Mar 12 12:47:23.698588 master-0 kubenswrapper[13984]: I0312 12:47:23.697875 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:23.942794 master-0 kubenswrapper[13984]: I0312 12:47:23.942039 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:24.006759 master-0 
kubenswrapper[13984]: I0312 12:47:24.006630 13984 scope.go:117] "RemoveContainer" containerID="91f4677cb9cc34c2283212e4c52245d1c898f39124b170d19dc2b0f5a76f53ae" Mar 12 12:47:24.007055 master-0 kubenswrapper[13984]: E0312 12:47:24.007022 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-59d99f4857-zbcnf_openstack(64252643-1a6d-4283-a6be-bdb4adbac235)\"" pod="openstack/ironic-59d99f4857-zbcnf" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" Mar 12 12:47:24.018914 master-0 kubenswrapper[13984]: I0312 12:47:24.018866 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" path="/var/lib/kubelet/pods/99a4ff2d-fa32-4c3c-ae39-8a6f745c8118/volumes" Mar 12 12:47:24.032600 master-0 kubenswrapper[13984]: I0312 12:47:24.032559 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-8c9c7-scheduler-0" Mar 12 12:47:24.075685 master-0 kubenswrapper[13984]: I0312 12:47:24.075630 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:24.298412 master-0 kubenswrapper[13984]: I0312 12:47:24.298295 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-6crfp"] Mar 12 12:47:24.300587 master-0 kubenswrapper[13984]: E0312 12:47:24.299414 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerName="dnsmasq-dns" Mar 12 12:47:24.300587 master-0 kubenswrapper[13984]: I0312 12:47:24.299526 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerName="dnsmasq-dns" Mar 12 12:47:24.300587 master-0 kubenswrapper[13984]: E0312 12:47:24.299554 13984 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerName="init" Mar 12 12:47:24.300587 master-0 kubenswrapper[13984]: I0312 12:47:24.299566 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerName="init" Mar 12 12:47:24.303991 master-0 kubenswrapper[13984]: I0312 12:47:24.300902 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="99a4ff2d-fa32-4c3c-ae39-8a6f745c8118" containerName="dnsmasq-dns" Mar 12 12:47:24.304343 master-0 kubenswrapper[13984]: I0312 12:47:24.304309 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.308660 master-0 kubenswrapper[13984]: I0312 12:47:24.308588 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 12 12:47:24.313540 master-0 kubenswrapper[13984]: I0312 12:47:24.309109 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 12 12:47:24.377509 master-0 kubenswrapper[13984]: I0312 12:47:24.376592 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-6crfp"] Mar 12 12:47:24.390912 master-0 kubenswrapper[13984]: I0312 12:47:24.388881 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kq7f\" (UniqueName: \"kubernetes.io/projected/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-kube-api-access-6kq7f\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.390912 master-0 kubenswrapper[13984]: I0312 12:47:24.388966 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic\") pod \"ironic-inspector-db-sync-6crfp\" (UID: 
\"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.390912 master-0 kubenswrapper[13984]: I0312 12:47:24.389078 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-combined-ca-bundle\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.390912 master-0 kubenswrapper[13984]: I0312 12:47:24.389150 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-etc-podinfo\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.390912 master-0 kubenswrapper[13984]: I0312 12:47:24.389192 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-scripts\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.390912 master-0 kubenswrapper[13984]: I0312 12:47:24.389254 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-config\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.390912 master-0 kubenswrapper[13984]: I0312 12:47:24.389373 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: 
\"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.472506 master-0 kubenswrapper[13984]: I0312 12:47:24.471938 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-8c9c7-volume-lvm-iscsi-0" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492012 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-combined-ca-bundle\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492128 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-etc-podinfo\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492173 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-scripts\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492240 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-config\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " 
pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492266 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492302 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kq7f\" (UniqueName: \"kubernetes.io/projected/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-kube-api-access-6kq7f\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492335 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.493214 master-0 kubenswrapper[13984]: I0312 12:47:24.492786 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.501009 master-0 kubenswrapper[13984]: I0312 12:47:24.495829 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: 
\"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.514975 master-0 kubenswrapper[13984]: I0312 12:47:24.504817 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-config\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.514975 master-0 kubenswrapper[13984]: I0312 12:47:24.505232 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-etc-podinfo\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.514975 master-0 kubenswrapper[13984]: I0312 12:47:24.505777 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-scripts\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.515916 master-0 kubenswrapper[13984]: I0312 12:47:24.515851 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-combined-ca-bundle\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.520497 master-0 kubenswrapper[13984]: I0312 12:47:24.517173 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kq7f\" (UniqueName: 
\"kubernetes.io/projected/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-kube-api-access-6kq7f\") pod \"ironic-inspector-db-sync-6crfp\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") " pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:24.649431 master-0 kubenswrapper[13984]: I0312 12:47:24.649314 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-6crfp" Mar 12 12:47:25.181876 master-0 kubenswrapper[13984]: I0312 12:47:25.178261 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:25.206117 master-0 kubenswrapper[13984]: I0312 12:47:25.205347 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-bcc965b68-5qb2v"] Mar 12 12:47:25.211466 master-0 kubenswrapper[13984]: I0312 12:47:25.211238 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.229568 master-0 kubenswrapper[13984]: I0312 12:47:25.213379 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 12 12:47:25.229568 master-0 kubenswrapper[13984]: I0312 12:47:25.214365 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 12 12:47:25.229568 master-0 kubenswrapper[13984]: I0312 12:47:25.215411 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 12 12:47:25.277072 master-0 kubenswrapper[13984]: I0312 12:47:25.269898 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bcc965b68-5qb2v"] Mar 12 12:47:25.305001 master-0 kubenswrapper[13984]: I0312 12:47:25.302238 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-6crfp"] Mar 12 12:47:25.319771 master-0 kubenswrapper[13984]: W0312 12:47:25.319607 13984 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bd44f3d_5677_4edd_8e29_8f010f3bfed1.slice/crio-9a442509f9d0154f81c97d28bae98238712305d8bce7fc44cd03be5dcf902625 WatchSource:0}: Error finding container 9a442509f9d0154f81c97d28bae98238712305d8bce7fc44cd03be5dcf902625: Status 404 returned error can't find the container with id 9a442509f9d0154f81c97d28bae98238712305d8bce7fc44cd03be5dcf902625 Mar 12 12:47:25.326312 master-0 kubenswrapper[13984]: I0312 12:47:25.326221 13984 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:25.326795 master-0 kubenswrapper[13984]: I0312 12:47:25.326771 13984 scope.go:117] "RemoveContainer" containerID="91f4677cb9cc34c2283212e4c52245d1c898f39124b170d19dc2b0f5a76f53ae" Mar 12 12:47:25.329687 master-0 kubenswrapper[13984]: E0312 12:47:25.329619 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-59d99f4857-zbcnf_openstack(64252643-1a6d-4283-a6be-bdb4adbac235)\"" pod="openstack/ironic-59d99f4857-zbcnf" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" Mar 12 12:47:25.330753 master-0 kubenswrapper[13984]: I0312 12:47:25.330077 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:25.351674 master-0 kubenswrapper[13984]: I0312 12:47:25.351527 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e02202f-67a8-46f0-a049-0f6b5d33358d-log-httpd\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.351674 master-0 kubenswrapper[13984]: I0312 12:47:25.351598 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9sqj\" (UniqueName: \"kubernetes.io/projected/0e02202f-67a8-46f0-a049-0f6b5d33358d-kube-api-access-n9sqj\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.351956 master-0 kubenswrapper[13984]: I0312 12:47:25.351918 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-combined-ca-bundle\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.352012 master-0 kubenswrapper[13984]: I0312 12:47:25.351966 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-internal-tls-certs\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.352089 master-0 kubenswrapper[13984]: I0312 12:47:25.352042 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-config-data\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.352170 master-0 kubenswrapper[13984]: I0312 12:47:25.352124 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-public-tls-certs\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " 
pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.352266 master-0 kubenswrapper[13984]: I0312 12:47:25.352237 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0e02202f-67a8-46f0-a049-0f6b5d33358d-etc-swift\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.352313 master-0 kubenswrapper[13984]: I0312 12:47:25.352302 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e02202f-67a8-46f0-a049-0f6b5d33358d-run-httpd\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.458522 master-0 kubenswrapper[13984]: I0312 12:47:25.458465 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-public-tls-certs\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.458757 master-0 kubenswrapper[13984]: I0312 12:47:25.458563 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0e02202f-67a8-46f0-a049-0f6b5d33358d-etc-swift\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.458757 master-0 kubenswrapper[13984]: I0312 12:47:25.458624 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e02202f-67a8-46f0-a049-0f6b5d33358d-run-httpd\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: 
\"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.458757 master-0 kubenswrapper[13984]: I0312 12:47:25.458732 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e02202f-67a8-46f0-a049-0f6b5d33358d-log-httpd\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.458860 master-0 kubenswrapper[13984]: I0312 12:47:25.458791 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9sqj\" (UniqueName: \"kubernetes.io/projected/0e02202f-67a8-46f0-a049-0f6b5d33358d-kube-api-access-n9sqj\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.458907 master-0 kubenswrapper[13984]: I0312 12:47:25.458860 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-combined-ca-bundle\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.458907 master-0 kubenswrapper[13984]: I0312 12:47:25.458895 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-internal-tls-certs\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.459586 master-0 kubenswrapper[13984]: I0312 12:47:25.459565 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e02202f-67a8-46f0-a049-0f6b5d33358d-log-httpd\") pod \"swift-proxy-bcc965b68-5qb2v\" 
(UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.460445 master-0 kubenswrapper[13984]: I0312 12:47:25.460409 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-config-data\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.461707 master-0 kubenswrapper[13984]: I0312 12:47:25.460955 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0e02202f-67a8-46f0-a049-0f6b5d33358d-run-httpd\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.463619 master-0 kubenswrapper[13984]: I0312 12:47:25.462855 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-public-tls-certs\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.469616 master-0 kubenswrapper[13984]: I0312 12:47:25.467662 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-combined-ca-bundle\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.469616 master-0 kubenswrapper[13984]: I0312 12:47:25.467960 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-config-data\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") 
" pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.469616 master-0 kubenswrapper[13984]: I0312 12:47:25.469005 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0e02202f-67a8-46f0-a049-0f6b5d33358d-internal-tls-certs\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.469616 master-0 kubenswrapper[13984]: I0312 12:47:25.469103 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0e02202f-67a8-46f0-a049-0f6b5d33358d-etc-swift\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.479409 master-0 kubenswrapper[13984]: I0312 12:47:25.479365 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9sqj\" (UniqueName: \"kubernetes.io/projected/0e02202f-67a8-46f0-a049-0f6b5d33358d-kube-api-access-n9sqj\") pod \"swift-proxy-bcc965b68-5qb2v\" (UID: \"0e02202f-67a8-46f0-a049-0f6b5d33358d\") " pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:25.585502 master-0 kubenswrapper[13984]: I0312 12:47:25.584284 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:26.140449 master-0 kubenswrapper[13984]: I0312 12:47:26.136781 13984 scope.go:117] "RemoveContainer" containerID="91f4677cb9cc34c2283212e4c52245d1c898f39124b170d19dc2b0f5a76f53ae" Mar 12 12:47:26.140449 master-0 kubenswrapper[13984]: E0312 12:47:26.138612 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-59d99f4857-zbcnf_openstack(64252643-1a6d-4283-a6be-bdb4adbac235)\"" pod="openstack/ironic-59d99f4857-zbcnf" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" Mar 12 12:47:26.159183 master-0 kubenswrapper[13984]: I0312 12:47:26.159124 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6crfp" event={"ID":"5bd44f3d-5677-4edd-8e29-8f010f3bfed1","Type":"ContainerStarted","Data":"9a442509f9d0154f81c97d28bae98238712305d8bce7fc44cd03be5dcf902625"} Mar 12 12:47:26.159338 master-0 kubenswrapper[13984]: I0312 12:47:26.159211 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:47:26.303773 master-0 kubenswrapper[13984]: I0312 12:47:26.302231 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-bcc965b68-5qb2v"] Mar 12 12:47:27.061597 master-0 kubenswrapper[13984]: I0312 12:47:27.058147 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-5bc96c7dbf-x46lx" Mar 12 12:47:27.124112 master-0 kubenswrapper[13984]: I0312 12:47:27.123963 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bcc965b68-5qb2v" event={"ID":"0e02202f-67a8-46f0-a049-0f6b5d33358d","Type":"ContainerStarted","Data":"e67d84d072bd759069909faa1ef933055bfd5cdafae223b8dda29add91554d4b"} Mar 12 12:47:27.124112 master-0 kubenswrapper[13984]: I0312 12:47:27.124013 13984 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bcc965b68-5qb2v" event={"ID":"0e02202f-67a8-46f0-a049-0f6b5d33358d","Type":"ContainerStarted","Data":"253ee08735d1ff3562b9f033a254ef1fede4cd3ea2214b4f9566d06b4e2d9f3e"} Mar 12 12:47:27.281198 master-0 kubenswrapper[13984]: I0312 12:47:27.280431 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-59d99f4857-zbcnf"] Mar 12 12:47:27.281198 master-0 kubenswrapper[13984]: I0312 12:47:27.280957 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-59d99f4857-zbcnf" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api-log" containerID="cri-o://4f36b1500967347631bfc950041c38dee15068bcaed2efe17938b6bc7aa4d799" gracePeriod=60 Mar 12 12:47:28.826035 master-0 kubenswrapper[13984]: I0312 12:47:28.821879 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6dc97885fc-g6g5j" Mar 12 12:47:28.922073 master-0 kubenswrapper[13984]: I0312 12:47:28.922023 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f777f5fbd-l96tl"] Mar 12 12:47:28.922384 master-0 kubenswrapper[13984]: I0312 12:47:28.922318 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f777f5fbd-l96tl" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-api" containerID="cri-o://53ea1660947e8524c6066f7f907dc7ab036b6030e398ed32f2b7a1b487336f0f" gracePeriod=30 Mar 12 12:47:28.922516 master-0 kubenswrapper[13984]: I0312 12:47:28.922406 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7f777f5fbd-l96tl" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-httpd" containerID="cri-o://2920d12c181fbe99698f1a8120e7ea3dabc0637c05f3659d64093bb6d59a8101" gracePeriod=30 Mar 12 12:47:29.355437 master-0 kubenswrapper[13984]: I0312 12:47:29.355347 13984 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-8c9c7-backup-0" Mar 12 12:47:30.100569 master-0 kubenswrapper[13984]: E0312 12:47:30.100426 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" cmd=["/bin/true"] Mar 12 12:47:30.101387 master-0 kubenswrapper[13984]: E0312 12:47:30.100652 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" cmd=["/bin/true"] Mar 12 12:47:30.101387 master-0 kubenswrapper[13984]: E0312 12:47:30.100981 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" cmd=["/bin/true"] Mar 12 12:47:30.101387 master-0 kubenswrapper[13984]: E0312 12:47:30.101035 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" cmd=["/bin/true"] Mar 12 12:47:30.102021 master-0 kubenswrapper[13984]: E0312 12:47:30.101994 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" cmd=["/bin/true"] Mar 12 12:47:30.102097 master-0 kubenswrapper[13984]: E0312 12:47:30.102075 13984 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" probeType="Liveness" pod="openstack/ironic-neutron-agent-776949857-qhnzl" podUID="4cecbed0-f638-495f-a450-ddcb64f6cc30" containerName="ironic-neutron-agent" Mar 12 12:47:30.102144 master-0 kubenswrapper[13984]: E0312 12:47:30.102127 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" cmd=["/bin/true"] Mar 12 12:47:30.102185 master-0 kubenswrapper[13984]: E0312 12:47:30.102143 13984 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8 is running failed: container process not found" probeType="Readiness" pod="openstack/ironic-neutron-agent-776949857-qhnzl" podUID="4cecbed0-f638-495f-a450-ddcb64f6cc30" containerName="ironic-neutron-agent" Mar 12 12:47:30.238816 master-0 kubenswrapper[13984]: I0312 12:47:30.238621 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-bcc965b68-5qb2v" 
event={"ID":"0e02202f-67a8-46f0-a049-0f6b5d33358d","Type":"ContainerStarted","Data":"1c0a56c87277ecf26b7524a752543c3f024281145e9003e486c98375efd5da15"} Mar 12 12:47:30.239588 master-0 kubenswrapper[13984]: I0312 12:47:30.239556 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:30.239813 master-0 kubenswrapper[13984]: I0312 12:47:30.239796 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:30.251583 master-0 kubenswrapper[13984]: I0312 12:47:30.251423 13984 generic.go:334] "Generic (PLEG): container finished" podID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerID="2920d12c181fbe99698f1a8120e7ea3dabc0637c05f3659d64093bb6d59a8101" exitCode=0 Mar 12 12:47:30.251583 master-0 kubenswrapper[13984]: I0312 12:47:30.251506 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f777f5fbd-l96tl" event={"ID":"91599efa-a74b-48bc-8553-58d2c33e1e75","Type":"ContainerDied","Data":"2920d12c181fbe99698f1a8120e7ea3dabc0637c05f3659d64093bb6d59a8101"} Mar 12 12:47:30.286210 master-0 kubenswrapper[13984]: I0312 12:47:30.286159 13984 generic.go:334] "Generic (PLEG): container finished" podID="64252643-1a6d-4283-a6be-bdb4adbac235" containerID="4f36b1500967347631bfc950041c38dee15068bcaed2efe17938b6bc7aa4d799" exitCode=143 Mar 12 12:47:30.286635 master-0 kubenswrapper[13984]: I0312 12:47:30.286236 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerDied","Data":"4f36b1500967347631bfc950041c38dee15068bcaed2efe17938b6bc7aa4d799"} Mar 12 12:47:30.293139 master-0 kubenswrapper[13984]: I0312 12:47:30.292252 13984 generic.go:334] "Generic (PLEG): container finished" podID="4cecbed0-f638-495f-a450-ddcb64f6cc30" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" exitCode=1 Mar 12 
12:47:30.293139 master-0 kubenswrapper[13984]: I0312 12:47:30.292307 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-776949857-qhnzl" event={"ID":"4cecbed0-f638-495f-a450-ddcb64f6cc30","Type":"ContainerDied","Data":"f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8"} Mar 12 12:47:30.293139 master-0 kubenswrapper[13984]: I0312 12:47:30.292346 13984 scope.go:117] "RemoveContainer" containerID="9736494de2afa0c367c2947a2471e1bf0469edfbb6ece544e26f72ffcffc506f" Mar 12 12:47:30.293139 master-0 kubenswrapper[13984]: I0312 12:47:30.293102 13984 scope.go:117] "RemoveContainer" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" Mar 12 12:47:30.298232 master-0 kubenswrapper[13984]: E0312 12:47:30.293455 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-776949857-qhnzl_openstack(4cecbed0-f638-495f-a450-ddcb64f6cc30)\"" pod="openstack/ironic-neutron-agent-776949857-qhnzl" podUID="4cecbed0-f638-495f-a450-ddcb64f6cc30" Mar 12 12:47:30.416061 master-0 kubenswrapper[13984]: I0312 12:47:30.410687 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-bcc965b68-5qb2v" podStartSLOduration=5.410668997 podStartE2EDuration="5.410668997s" podCreationTimestamp="2026-03-12 12:47:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:30.409009888 +0000 UTC m=+1382.607025380" watchObservedRunningTime="2026-03-12 12:47:30.410668997 +0000 UTC m=+1382.608684489" Mar 12 12:47:31.306667 master-0 kubenswrapper[13984]: I0312 12:47:31.301412 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-qfsbw"] Mar 12 12:47:31.307282 master-0 kubenswrapper[13984]: 
I0312 12:47:31.306806 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.356902 master-0 kubenswrapper[13984]: I0312 12:47:31.356678 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qfsbw"] Mar 12 12:47:31.416442 master-0 kubenswrapper[13984]: I0312 12:47:31.411138 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/swift-proxy-bcc965b68-5qb2v" podUID="0e02202f-67a8-46f0-a049-0f6b5d33358d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 12 12:47:31.416442 master-0 kubenswrapper[13984]: I0312 12:47:31.414718 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-162c-account-create-update-jzxl9"] Mar 12 12:47:31.417455 master-0 kubenswrapper[13984]: I0312 12:47:31.417417 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.423766 master-0 kubenswrapper[13984]: I0312 12:47:31.419891 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4wxz\" (UniqueName: \"kubernetes.io/projected/89d18698-6635-4aba-a43e-eaf81cf72796-kube-api-access-b4wxz\") pod \"nova-api-db-create-qfsbw\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.423766 master-0 kubenswrapper[13984]: I0312 12:47:31.419986 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d18698-6635-4aba-a43e-eaf81cf72796-operator-scripts\") pod \"nova-api-db-create-qfsbw\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.424044 master-0 kubenswrapper[13984]: I0312 12:47:31.424007 13984 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 12 12:47:31.443867 master-0 kubenswrapper[13984]: I0312 12:47:31.438918 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-162c-account-create-update-jzxl9"] Mar 12 12:47:31.457272 master-0 kubenswrapper[13984]: I0312 12:47:31.453398 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-tkj85"] Mar 12 12:47:31.457272 master-0 kubenswrapper[13984]: I0312 12:47:31.455646 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:31.494558 master-0 kubenswrapper[13984]: I0312 12:47:31.492343 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tkj85"] Mar 12 12:47:31.531847 master-0 kubenswrapper[13984]: I0312 12:47:31.531814 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4wxz\" (UniqueName: \"kubernetes.io/projected/89d18698-6635-4aba-a43e-eaf81cf72796-kube-api-access-b4wxz\") pod \"nova-api-db-create-qfsbw\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.534352 master-0 kubenswrapper[13984]: I0312 12:47:31.534311 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d18698-6635-4aba-a43e-eaf81cf72796-operator-scripts\") pod \"nova-api-db-create-qfsbw\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.541267 master-0 kubenswrapper[13984]: I0312 12:47:31.541122 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbeb3b5-acd9-4095-bc17-29c1b6a29960-operator-scripts\") pod \"nova-api-162c-account-create-update-jzxl9\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") " 
pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.542681 master-0 kubenswrapper[13984]: I0312 12:47:31.542655 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl7cr\" (UniqueName: \"kubernetes.io/projected/afbeb3b5-acd9-4095-bc17-29c1b6a29960-kube-api-access-zl7cr\") pod \"nova-api-162c-account-create-update-jzxl9\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") " pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.543297 master-0 kubenswrapper[13984]: I0312 12:47:31.543105 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d18698-6635-4aba-a43e-eaf81cf72796-operator-scripts\") pod \"nova-api-db-create-qfsbw\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.614568 master-0 kubenswrapper[13984]: I0312 12:47:31.614404 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4wxz\" (UniqueName: \"kubernetes.io/projected/89d18698-6635-4aba-a43e-eaf81cf72796-kube-api-access-b4wxz\") pod \"nova-api-db-create-qfsbw\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.618844 master-0 kubenswrapper[13984]: I0312 12:47:31.618799 13984 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-bcc965b68-5qb2v" podUID="0e02202f-67a8-46f0-a049-0f6b5d33358d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 12 12:47:31.629804 master-0 kubenswrapper[13984]: I0312 12:47:31.629704 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-nchvw"] Mar 12 12:47:31.632861 master-0 kubenswrapper[13984]: I0312 12:47:31.632521 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.645256 master-0 kubenswrapper[13984]: I0312 12:47:31.645029 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbeb3b5-acd9-4095-bc17-29c1b6a29960-operator-scripts\") pod \"nova-api-162c-account-create-update-jzxl9\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") " pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.646282 master-0 kubenswrapper[13984]: I0312 12:47:31.646170 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-642a-account-create-update-lwl6p"] Mar 12 12:47:31.650590 master-0 kubenswrapper[13984]: I0312 12:47:31.649790 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zl7cr\" (UniqueName: \"kubernetes.io/projected/afbeb3b5-acd9-4095-bc17-29c1b6a29960-kube-api-access-zl7cr\") pod \"nova-api-162c-account-create-update-jzxl9\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") " pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.650590 master-0 kubenswrapper[13984]: I0312 12:47:31.649889 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413ffd1a-0d7d-4ee6-8868-1828490695d6-operator-scripts\") pod \"nova-cell0-db-create-tkj85\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:31.650590 master-0 kubenswrapper[13984]: I0312 12:47:31.650161 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7br7d\" (UniqueName: \"kubernetes.io/projected/413ffd1a-0d7d-4ee6-8868-1828490695d6-kube-api-access-7br7d\") pod \"nova-cell0-db-create-tkj85\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " pod="openstack/nova-cell0-db-create-tkj85" Mar 12 
12:47:31.651273 master-0 kubenswrapper[13984]: I0312 12:47:31.650830 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbeb3b5-acd9-4095-bc17-29c1b6a29960-operator-scripts\") pod \"nova-api-162c-account-create-update-jzxl9\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") " pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.653788 master-0 kubenswrapper[13984]: I0312 12:47:31.653752 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:31.661930 master-0 kubenswrapper[13984]: I0312 12:47:31.661893 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:31.667436 master-0 kubenswrapper[13984]: I0312 12:47:31.667406 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 12 12:47:31.679020 master-0 kubenswrapper[13984]: I0312 12:47:31.677960 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nchvw"] Mar 12 12:47:31.684809 master-0 kubenswrapper[13984]: I0312 12:47:31.683436 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl7cr\" (UniqueName: \"kubernetes.io/projected/afbeb3b5-acd9-4095-bc17-29c1b6a29960-kube-api-access-zl7cr\") pod \"nova-api-162c-account-create-update-jzxl9\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") " pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.696285 master-0 kubenswrapper[13984]: I0312 12:47:31.693631 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-642a-account-create-update-lwl6p"] Mar 12 12:47:31.755889 master-0 kubenswrapper[13984]: I0312 12:47:31.755723 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7cr76\" (UniqueName: \"kubernetes.io/projected/c00d2711-454c-4bc3-9db9-55fa8537ecad-kube-api-access-7cr76\") pod \"nova-cell0-642a-account-create-update-lwl6p\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:31.756192 master-0 kubenswrapper[13984]: I0312 12:47:31.755913 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rszrw\" (UniqueName: \"kubernetes.io/projected/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-kube-api-access-rszrw\") pod \"nova-cell1-db-create-nchvw\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.756192 master-0 kubenswrapper[13984]: I0312 12:47:31.756058 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00d2711-454c-4bc3-9db9-55fa8537ecad-operator-scripts\") pod \"nova-cell0-642a-account-create-update-lwl6p\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:31.756266 master-0 kubenswrapper[13984]: I0312 12:47:31.756193 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-operator-scripts\") pod \"nova-cell1-db-create-nchvw\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.756310 master-0 kubenswrapper[13984]: I0312 12:47:31.756297 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413ffd1a-0d7d-4ee6-8868-1828490695d6-operator-scripts\") pod \"nova-cell0-db-create-tkj85\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " 
pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:31.756583 master-0 kubenswrapper[13984]: I0312 12:47:31.756558 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7br7d\" (UniqueName: \"kubernetes.io/projected/413ffd1a-0d7d-4ee6-8868-1828490695d6-kube-api-access-7br7d\") pod \"nova-cell0-db-create-tkj85\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:31.757444 master-0 kubenswrapper[13984]: I0312 12:47:31.757404 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413ffd1a-0d7d-4ee6-8868-1828490695d6-operator-scripts\") pod \"nova-cell0-db-create-tkj85\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:31.782523 master-0 kubenswrapper[13984]: I0312 12:47:31.779349 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7br7d\" (UniqueName: \"kubernetes.io/projected/413ffd1a-0d7d-4ee6-8868-1828490695d6-kube-api-access-7br7d\") pod \"nova-cell0-db-create-tkj85\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:31.794091 master-0 kubenswrapper[13984]: I0312 12:47:31.791146 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:31.811350 master-0 kubenswrapper[13984]: I0312 12:47:31.811283 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-0b4a-account-create-update-4zbjs"] Mar 12 12:47:31.817301 master-0 kubenswrapper[13984]: I0312 12:47:31.813269 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:31.817301 master-0 kubenswrapper[13984]: I0312 12:47:31.815939 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 12 12:47:31.855187 master-0 kubenswrapper[13984]: I0312 12:47:31.855123 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:31.856736 master-0 kubenswrapper[13984]: I0312 12:47:31.856668 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0b4a-account-create-update-4zbjs"] Mar 12 12:47:31.859508 master-0 kubenswrapper[13984]: I0312 12:47:31.859232 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cr76\" (UniqueName: \"kubernetes.io/projected/c00d2711-454c-4bc3-9db9-55fa8537ecad-kube-api-access-7cr76\") pod \"nova-cell0-642a-account-create-update-lwl6p\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:31.859508 master-0 kubenswrapper[13984]: I0312 12:47:31.859338 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rszrw\" (UniqueName: \"kubernetes.io/projected/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-kube-api-access-rszrw\") pod \"nova-cell1-db-create-nchvw\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.859508 master-0 kubenswrapper[13984]: I0312 12:47:31.859398 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00d2711-454c-4bc3-9db9-55fa8537ecad-operator-scripts\") pod \"nova-cell0-642a-account-create-update-lwl6p\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:31.859508 master-0 
kubenswrapper[13984]: I0312 12:47:31.859472 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-operator-scripts\") pod \"nova-cell1-db-create-nchvw\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.861247 master-0 kubenswrapper[13984]: I0312 12:47:31.860596 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00d2711-454c-4bc3-9db9-55fa8537ecad-operator-scripts\") pod \"nova-cell0-642a-account-create-update-lwl6p\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:31.861247 master-0 kubenswrapper[13984]: I0312 12:47:31.860689 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-operator-scripts\") pod \"nova-cell1-db-create-nchvw\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.890017 master-0 kubenswrapper[13984]: I0312 12:47:31.888764 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rszrw\" (UniqueName: \"kubernetes.io/projected/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-kube-api-access-rszrw\") pod \"nova-cell1-db-create-nchvw\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.890017 master-0 kubenswrapper[13984]: I0312 12:47:31.889241 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cr76\" (UniqueName: \"kubernetes.io/projected/c00d2711-454c-4bc3-9db9-55fa8537ecad-kube-api-access-7cr76\") pod \"nova-cell0-642a-account-create-update-lwl6p\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " 
pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:31.965476 master-0 kubenswrapper[13984]: I0312 12:47:31.965277 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:31.968203 master-0 kubenswrapper[13984]: I0312 12:47:31.968119 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw5jk\" (UniqueName: \"kubernetes.io/projected/7180a2fd-8928-4b52-a707-a08eb7230a3b-kube-api-access-kw5jk\") pod \"nova-cell1-0b4a-account-create-update-4zbjs\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:31.968606 master-0 kubenswrapper[13984]: I0312 12:47:31.968286 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7180a2fd-8928-4b52-a707-a08eb7230a3b-operator-scripts\") pod \"nova-cell1-0b4a-account-create-update-4zbjs\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:32.044668 master-0 kubenswrapper[13984]: I0312 12:47:32.037713 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:32.074428 master-0 kubenswrapper[13984]: I0312 12:47:32.073996 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7180a2fd-8928-4b52-a707-a08eb7230a3b-operator-scripts\") pod \"nova-cell1-0b4a-account-create-update-4zbjs\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:32.074428 master-0 kubenswrapper[13984]: I0312 12:47:32.074175 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw5jk\" (UniqueName: \"kubernetes.io/projected/7180a2fd-8928-4b52-a707-a08eb7230a3b-kube-api-access-kw5jk\") pod \"nova-cell1-0b4a-account-create-update-4zbjs\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:32.075254 master-0 kubenswrapper[13984]: I0312 12:47:32.075229 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7180a2fd-8928-4b52-a707-a08eb7230a3b-operator-scripts\") pod \"nova-cell1-0b4a-account-create-update-4zbjs\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:32.092202 master-0 kubenswrapper[13984]: I0312 12:47:32.092151 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:32.098077 master-0 kubenswrapper[13984]: I0312 12:47:32.098030 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw5jk\" (UniqueName: \"kubernetes.io/projected/7180a2fd-8928-4b52-a707-a08eb7230a3b-kube-api-access-kw5jk\") pod \"nova-cell1-0b4a-account-create-update-4zbjs\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:32.151947 master-0 kubenswrapper[13984]: I0312 12:47:32.151331 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290216 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-logs\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290288 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/64252643-1a6d-4283-a6be-bdb4adbac235-etc-podinfo\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290344 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-custom\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290515 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-combined-ca-bundle\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290564 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-merged\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290643 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-scripts\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290689 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gr5c2\" (UniqueName: \"kubernetes.io/projected/64252643-1a6d-4283-a6be-bdb4adbac235-kube-api-access-gr5c2\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.290770 master-0 kubenswrapper[13984]: I0312 12:47:32.290736 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data\") pod \"64252643-1a6d-4283-a6be-bdb4adbac235\" (UID: \"64252643-1a6d-4283-a6be-bdb4adbac235\") " Mar 12 12:47:32.296866 master-0 kubenswrapper[13984]: I0312 12:47:32.296078 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: 
"64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:32.318528 master-0 kubenswrapper[13984]: I0312 12:47:32.307238 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-logs" (OuterVolumeSpecName: "logs") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: "64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:32.337538 master-0 kubenswrapper[13984]: I0312 12:47:32.331359 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/64252643-1a6d-4283-a6be-bdb4adbac235-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: "64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 12:47:32.338790 master-0 kubenswrapper[13984]: I0312 12:47:32.338695 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-scripts" (OuterVolumeSpecName: "scripts") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: "64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:32.348567 master-0 kubenswrapper[13984]: I0312 12:47:32.348196 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64252643-1a6d-4283-a6be-bdb4adbac235-kube-api-access-gr5c2" (OuterVolumeSpecName: "kube-api-access-gr5c2") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: "64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "kube-api-access-gr5c2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:32.348567 master-0 kubenswrapper[13984]: I0312 12:47:32.348339 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: "64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:32.393371 master-0 kubenswrapper[13984]: I0312 12:47:32.393286 13984 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.393371 master-0 kubenswrapper[13984]: I0312 12:47:32.393331 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.393371 master-0 kubenswrapper[13984]: I0312 12:47:32.393341 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gr5c2\" (UniqueName: \"kubernetes.io/projected/64252643-1a6d-4283-a6be-bdb4adbac235-kube-api-access-gr5c2\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.393371 master-0 kubenswrapper[13984]: I0312 12:47:32.393350 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/64252643-1a6d-4283-a6be-bdb4adbac235-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.393371 master-0 kubenswrapper[13984]: I0312 12:47:32.393360 13984 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/64252643-1a6d-4283-a6be-bdb4adbac235-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.393371 master-0 kubenswrapper[13984]: I0312 
12:47:32.393369 13984 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.394298 master-0 kubenswrapper[13984]: I0312 12:47:32.394252 13984 generic.go:334] "Generic (PLEG): container finished" podID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerID="53ea1660947e8524c6066f7f907dc7ab036b6030e398ed32f2b7a1b487336f0f" exitCode=0 Mar 12 12:47:32.394367 master-0 kubenswrapper[13984]: I0312 12:47:32.394321 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f777f5fbd-l96tl" event={"ID":"91599efa-a74b-48bc-8553-58d2c33e1e75","Type":"ContainerDied","Data":"53ea1660947e8524c6066f7f907dc7ab036b6030e398ed32f2b7a1b487336f0f"} Mar 12 12:47:32.412999 master-0 kubenswrapper[13984]: I0312 12:47:32.412886 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-59d99f4857-zbcnf" event={"ID":"64252643-1a6d-4283-a6be-bdb4adbac235","Type":"ContainerDied","Data":"b8252ac36025fc4cdb429bad9b3730fd3e3cf80a5d96db41ff05f12c2d4a2975"} Mar 12 12:47:32.412999 master-0 kubenswrapper[13984]: I0312 12:47:32.412950 13984 scope.go:117] "RemoveContainer" containerID="91f4677cb9cc34c2283212e4c52245d1c898f39124b170d19dc2b0f5a76f53ae" Mar 12 12:47:32.413404 master-0 kubenswrapper[13984]: I0312 12:47:32.413034 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-59d99f4857-zbcnf" Mar 12 12:47:32.437055 master-0 kubenswrapper[13984]: I0312 12:47:32.435946 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data" (OuterVolumeSpecName: "config-data") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: "64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:32.499975 master-0 kubenswrapper[13984]: I0312 12:47:32.497789 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.547648 master-0 kubenswrapper[13984]: I0312 12:47:32.545757 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64252643-1a6d-4283-a6be-bdb4adbac235" (UID: "64252643-1a6d-4283-a6be-bdb4adbac235"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:32.602832 master-0 kubenswrapper[13984]: I0312 12:47:32.599684 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64252643-1a6d-4283-a6be-bdb4adbac235-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:32.784566 master-0 kubenswrapper[13984]: I0312 12:47:32.783712 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-59d99f4857-zbcnf"] Mar 12 12:47:32.797544 master-0 kubenswrapper[13984]: I0312 12:47:32.796361 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-59d99f4857-zbcnf"] Mar 12 12:47:33.998630 master-0 kubenswrapper[13984]: I0312 12:47:33.998574 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" path="/var/lib/kubelet/pods/64252643-1a6d-4283-a6be-bdb4adbac235/volumes" Mar 12 12:47:35.099937 master-0 kubenswrapper[13984]: I0312 12:47:35.099879 13984 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:35.101051 master-0 kubenswrapper[13984]: I0312 12:47:35.099949 13984 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:35.101051 master-0 kubenswrapper[13984]: I0312 12:47:35.100590 13984 scope.go:117] "RemoveContainer" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8" Mar 12 12:47:35.101051 master-0 kubenswrapper[13984]: E0312 12:47:35.100894 13984 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-776949857-qhnzl_openstack(4cecbed0-f638-495f-a450-ddcb64f6cc30)\"" pod="openstack/ironic-neutron-agent-776949857-qhnzl" podUID="4cecbed0-f638-495f-a450-ddcb64f6cc30" Mar 12 12:47:35.594511 master-0 kubenswrapper[13984]: I0312 12:47:35.593717 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:35.597103 master-0 kubenswrapper[13984]: I0312 12:47:35.595773 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-bcc965b68-5qb2v" Mar 12 12:47:39.695652 master-0 kubenswrapper[13984]: I0312 12:47:39.695608 13984 scope.go:117] "RemoveContainer" containerID="4f36b1500967347631bfc950041c38dee15068bcaed2efe17938b6bc7aa4d799" Mar 12 12:47:39.956671 master-0 kubenswrapper[13984]: I0312 12:47:39.956345 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:47:40.036877 master-0 kubenswrapper[13984]: I0312 12:47:40.036810 13984 scope.go:117] "RemoveContainer" containerID="2ae84dcc424826dece2a433c245097faf7d15c2363c7bfb4286e140326b977d3" Mar 12 12:47:40.051782 master-0 kubenswrapper[13984]: I0312 12:47:40.051711 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-httpd-config\") pod \"91599efa-a74b-48bc-8553-58d2c33e1e75\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " Mar 12 12:47:40.051782 master-0 kubenswrapper[13984]: I0312 12:47:40.051780 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92n8z\" (UniqueName: \"kubernetes.io/projected/91599efa-a74b-48bc-8553-58d2c33e1e75-kube-api-access-92n8z\") pod \"91599efa-a74b-48bc-8553-58d2c33e1e75\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " Mar 12 12:47:40.052071 master-0 kubenswrapper[13984]: I0312 12:47:40.051905 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-combined-ca-bundle\") pod \"91599efa-a74b-48bc-8553-58d2c33e1e75\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " Mar 12 12:47:40.052189 master-0 kubenswrapper[13984]: I0312 12:47:40.052173 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-config\") pod \"91599efa-a74b-48bc-8553-58d2c33e1e75\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " Mar 12 12:47:40.052257 master-0 kubenswrapper[13984]: I0312 12:47:40.052245 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-ovndb-tls-certs\") pod 
\"91599efa-a74b-48bc-8553-58d2c33e1e75\" (UID: \"91599efa-a74b-48bc-8553-58d2c33e1e75\") " Mar 12 12:47:40.069428 master-0 kubenswrapper[13984]: I0312 12:47:40.069286 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "91599efa-a74b-48bc-8553-58d2c33e1e75" (UID: "91599efa-a74b-48bc-8553-58d2c33e1e75"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:40.087093 master-0 kubenswrapper[13984]: I0312 12:47:40.086948 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91599efa-a74b-48bc-8553-58d2c33e1e75-kube-api-access-92n8z" (OuterVolumeSpecName: "kube-api-access-92n8z") pod "91599efa-a74b-48bc-8553-58d2c33e1e75" (UID: "91599efa-a74b-48bc-8553-58d2c33e1e75"). InnerVolumeSpecName "kube-api-access-92n8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:40.134657 master-0 kubenswrapper[13984]: I0312 12:47:40.134322 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "91599efa-a74b-48bc-8553-58d2c33e1e75" (UID: "91599efa-a74b-48bc-8553-58d2c33e1e75"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:40.161694 master-0 kubenswrapper[13984]: I0312 12:47:40.161642 13984 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-httpd-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:40.161892 master-0 kubenswrapper[13984]: I0312 12:47:40.161863 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92n8z\" (UniqueName: \"kubernetes.io/projected/91599efa-a74b-48bc-8553-58d2c33e1e75-kube-api-access-92n8z\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:40.161945 master-0 kubenswrapper[13984]: I0312 12:47:40.161906 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:40.176056 master-0 kubenswrapper[13984]: I0312 12:47:40.175891 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-config" (OuterVolumeSpecName: "config") pod "91599efa-a74b-48bc-8553-58d2c33e1e75" (UID: "91599efa-a74b-48bc-8553-58d2c33e1e75"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:40.229874 master-0 kubenswrapper[13984]: I0312 12:47:40.229794 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "91599efa-a74b-48bc-8553-58d2c33e1e75" (UID: "91599efa-a74b-48bc-8553-58d2c33e1e75"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:40.263658 master-0 kubenswrapper[13984]: I0312 12:47:40.263529 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:40.263658 master-0 kubenswrapper[13984]: I0312 12:47:40.263567 13984 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/91599efa-a74b-48bc-8553-58d2c33e1e75-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:40.500616 master-0 kubenswrapper[13984]: I0312 12:47:40.500556 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-qfsbw"] Mar 12 12:47:40.510570 master-0 kubenswrapper[13984]: I0312 12:47:40.510522 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-tkj85"] Mar 12 12:47:40.519209 master-0 kubenswrapper[13984]: W0312 12:47:40.519168 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod413ffd1a_0d7d_4ee6_8868_1828490695d6.slice/crio-334270c9211936b3e23d5aab01423b30798cda9d23bf92f2d5058e5812c72f6b WatchSource:0}: Error finding container 334270c9211936b3e23d5aab01423b30798cda9d23bf92f2d5058e5812c72f6b: Status 404 returned error can't find the container with id 334270c9211936b3e23d5aab01423b30798cda9d23bf92f2d5058e5812c72f6b Mar 12 12:47:40.576888 master-0 kubenswrapper[13984]: I0312 12:47:40.575247 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7f777f5fbd-l96tl" event={"ID":"91599efa-a74b-48bc-8553-58d2c33e1e75","Type":"ContainerDied","Data":"1cc2927014a05f3bc3e8b79591ec5ac31fe96ee1adf0d1cea15fad6b29e8653b"} Mar 12 12:47:40.576888 master-0 kubenswrapper[13984]: I0312 12:47:40.575304 13984 scope.go:117] "RemoveContainer" 
containerID="2920d12c181fbe99698f1a8120e7ea3dabc0637c05f3659d64093bb6d59a8101" Mar 12 12:47:40.576888 master-0 kubenswrapper[13984]: I0312 12:47:40.575466 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7f777f5fbd-l96tl" Mar 12 12:47:40.596589 master-0 kubenswrapper[13984]: I0312 12:47:40.596170 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tkj85" event={"ID":"413ffd1a-0d7d-4ee6-8868-1828490695d6","Type":"ContainerStarted","Data":"334270c9211936b3e23d5aab01423b30798cda9d23bf92f2d5058e5812c72f6b"} Mar 12 12:47:40.615961 master-0 kubenswrapper[13984]: I0312 12:47:40.599878 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qfsbw" event={"ID":"89d18698-6635-4aba-a43e-eaf81cf72796","Type":"ContainerStarted","Data":"b84c9f5a8a56853985ba404940b30fb1e51e609957a485e3cc082e0f5b28d6ef"} Mar 12 12:47:40.615961 master-0 kubenswrapper[13984]: I0312 12:47:40.607149 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1f0d75f9-805b-4886-9bff-8544337a2fd1","Type":"ContainerStarted","Data":"ae2e4bb3b0e4ab5afddc67e13005988cc738b333a10d1ea47c918e37e0be74fe"} Mar 12 12:47:40.634732 master-0 kubenswrapper[13984]: I0312 12:47:40.633413 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6crfp" event={"ID":"5bd44f3d-5677-4edd-8e29-8f010f3bfed1","Type":"ContainerStarted","Data":"8a4c54e13be7e71a268f2bc35c6a9b7c986f7801c131f0c086f2968d63cb28cb"} Mar 12 12:47:40.635118 master-0 kubenswrapper[13984]: I0312 12:47:40.635050 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.186149929 podStartE2EDuration="20.635028924s" podCreationTimestamp="2026-03-12 12:47:20 +0000 UTC" firstStartedPulling="2026-03-12 12:47:21.3716246 +0000 UTC m=+1373.569640092" lastFinishedPulling="2026-03-12 
12:47:39.820503595 +0000 UTC m=+1392.018519087" observedRunningTime="2026-03-12 12:47:40.632868436 +0000 UTC m=+1392.830883928" watchObservedRunningTime="2026-03-12 12:47:40.635028924 +0000 UTC m=+1392.833044436" Mar 12 12:47:40.650793 master-0 kubenswrapper[13984]: I0312 12:47:40.650553 13984 scope.go:117] "RemoveContainer" containerID="53ea1660947e8524c6066f7f907dc7ab036b6030e398ed32f2b7a1b487336f0f" Mar 12 12:47:40.700134 master-0 kubenswrapper[13984]: I0312 12:47:40.700072 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7f777f5fbd-l96tl"] Mar 12 12:47:40.721550 master-0 kubenswrapper[13984]: I0312 12:47:40.721453 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7f777f5fbd-l96tl"] Mar 12 12:47:40.957412 master-0 kubenswrapper[13984]: I0312 12:47:40.955773 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-6crfp" podStartSLOduration=2.510664931 podStartE2EDuration="16.955744996s" podCreationTimestamp="2026-03-12 12:47:24 +0000 UTC" firstStartedPulling="2026-03-12 12:47:25.33208626 +0000 UTC m=+1377.530101752" lastFinishedPulling="2026-03-12 12:47:39.777166325 +0000 UTC m=+1391.975181817" observedRunningTime="2026-03-12 12:47:40.739069914 +0000 UTC m=+1392.937085406" watchObservedRunningTime="2026-03-12 12:47:40.955744996 +0000 UTC m=+1393.153760488" Mar 12 12:47:40.974328 master-0 kubenswrapper[13984]: I0312 12:47:40.974279 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-0b4a-account-create-update-4zbjs"] Mar 12 12:47:40.997648 master-0 kubenswrapper[13984]: I0312 12:47:40.997325 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-nchvw"] Mar 12 12:47:41.021627 master-0 kubenswrapper[13984]: I0312 12:47:41.021271 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-642a-account-create-update-lwl6p"] Mar 12 12:47:41.091929 master-0 
kubenswrapper[13984]: I0312 12:47:41.090443 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-162c-account-create-update-jzxl9"] Mar 12 12:47:41.659218 master-0 kubenswrapper[13984]: I0312 12:47:41.659119 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-162c-account-create-update-jzxl9" event={"ID":"afbeb3b5-acd9-4095-bc17-29c1b6a29960","Type":"ContainerStarted","Data":"5fe27ddf623c2249d16ceb8d7d93ceda21fba569acfa6a617db5ba317e683218"} Mar 12 12:47:41.659218 master-0 kubenswrapper[13984]: I0312 12:47:41.659194 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-162c-account-create-update-jzxl9" event={"ID":"afbeb3b5-acd9-4095-bc17-29c1b6a29960","Type":"ContainerStarted","Data":"74a30656558ef62d2394ba8096515199afd42f6c6427c366902c27425f2add3c"} Mar 12 12:47:41.663698 master-0 kubenswrapper[13984]: I0312 12:47:41.663520 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" event={"ID":"c00d2711-454c-4bc3-9db9-55fa8537ecad","Type":"ContainerStarted","Data":"00a916cf08c787c878b3d1e99a991d2bfef30e1ad62b1dec6ffae0b05853b0c7"} Mar 12 12:47:41.663787 master-0 kubenswrapper[13984]: I0312 12:47:41.663756 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" event={"ID":"c00d2711-454c-4bc3-9db9-55fa8537ecad","Type":"ContainerStarted","Data":"6f0e8eb61b39f4596af6ad67a8272cd5c5c8badebaec177c8a6e70f75be38f89"} Mar 12 12:47:41.667123 master-0 kubenswrapper[13984]: I0312 12:47:41.667069 13984 generic.go:334] "Generic (PLEG): container finished" podID="9cba6f06-5a3b-4ebc-b36b-779fb17b727b" containerID="7774f5189f07503a5be7cf5fd874b36be3dde3e9f1bd75b62d526391fc4b77b9" exitCode=0 Mar 12 12:47:41.667187 master-0 kubenswrapper[13984]: I0312 12:47:41.667149 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nchvw" 
event={"ID":"9cba6f06-5a3b-4ebc-b36b-779fb17b727b","Type":"ContainerDied","Data":"7774f5189f07503a5be7cf5fd874b36be3dde3e9f1bd75b62d526391fc4b77b9"} Mar 12 12:47:41.667231 master-0 kubenswrapper[13984]: I0312 12:47:41.667185 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nchvw" event={"ID":"9cba6f06-5a3b-4ebc-b36b-779fb17b727b","Type":"ContainerStarted","Data":"a60e3de2248b91991011ec9866b2b61783013476497f751e848f6a588e165e62"} Mar 12 12:47:41.671487 master-0 kubenswrapper[13984]: I0312 12:47:41.670989 13984 generic.go:334] "Generic (PLEG): container finished" podID="89d18698-6635-4aba-a43e-eaf81cf72796" containerID="95cc072a0af7eec50922bd4c78e64986c17a44a84018b2ae54d95778097e18fc" exitCode=0 Mar 12 12:47:41.671487 master-0 kubenswrapper[13984]: I0312 12:47:41.671069 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qfsbw" event={"ID":"89d18698-6635-4aba-a43e-eaf81cf72796","Type":"ContainerDied","Data":"95cc072a0af7eec50922bd4c78e64986c17a44a84018b2ae54d95778097e18fc"} Mar 12 12:47:41.674745 master-0 kubenswrapper[13984]: I0312 12:47:41.674209 13984 generic.go:334] "Generic (PLEG): container finished" podID="413ffd1a-0d7d-4ee6-8868-1828490695d6" containerID="353124c3b2cf0e9a67d6bffb0d187e827bc3ac5edd0177d931a0e3d39842e76b" exitCode=0 Mar 12 12:47:41.674745 master-0 kubenswrapper[13984]: I0312 12:47:41.674302 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tkj85" event={"ID":"413ffd1a-0d7d-4ee6-8868-1828490695d6","Type":"ContainerDied","Data":"353124c3b2cf0e9a67d6bffb0d187e827bc3ac5edd0177d931a0e3d39842e76b"} Mar 12 12:47:41.677956 master-0 kubenswrapper[13984]: I0312 12:47:41.677671 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" 
event={"ID":"7180a2fd-8928-4b52-a707-a08eb7230a3b","Type":"ContainerStarted","Data":"c869af247970458cfd48246ea8d78f4ca160ab8e51e7469dc7f7e52b19b97bbc"} Mar 12 12:47:41.677956 master-0 kubenswrapper[13984]: I0312 12:47:41.677713 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" event={"ID":"7180a2fd-8928-4b52-a707-a08eb7230a3b","Type":"ContainerStarted","Data":"507be2677d5f5acda51f8cfefe5ddfcb213098eec13148be50cc28728f262c3e"} Mar 12 12:47:41.688052 master-0 kubenswrapper[13984]: I0312 12:47:41.687941 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-162c-account-create-update-jzxl9" podStartSLOduration=10.687913881 podStartE2EDuration="10.687913881s" podCreationTimestamp="2026-03-12 12:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:41.681536927 +0000 UTC m=+1393.879552419" watchObservedRunningTime="2026-03-12 12:47:41.687913881 +0000 UTC m=+1393.885929373" Mar 12 12:47:41.739229 master-0 kubenswrapper[13984]: I0312 12:47:41.739085 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" podStartSLOduration=10.739028659 podStartE2EDuration="10.739028659s" podCreationTimestamp="2026-03-12 12:47:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:41.738339287 +0000 UTC m=+1393.936354779" watchObservedRunningTime="2026-03-12 12:47:41.739028659 +0000 UTC m=+1393.937044151" Mar 12 12:47:42.008028 master-0 kubenswrapper[13984]: I0312 12:47:42.007852 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" path="/var/lib/kubelet/pods/91599efa-a74b-48bc-8553-58d2c33e1e75/volumes" Mar 12 12:47:42.696805 master-0 
kubenswrapper[13984]: I0312 12:47:42.696570 13984 generic.go:334] "Generic (PLEG): container finished" podID="7180a2fd-8928-4b52-a707-a08eb7230a3b" containerID="c869af247970458cfd48246ea8d78f4ca160ab8e51e7469dc7f7e52b19b97bbc" exitCode=0 Mar 12 12:47:42.696805 master-0 kubenswrapper[13984]: I0312 12:47:42.696652 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" event={"ID":"7180a2fd-8928-4b52-a707-a08eb7230a3b","Type":"ContainerDied","Data":"c869af247970458cfd48246ea8d78f4ca160ab8e51e7469dc7f7e52b19b97bbc"} Mar 12 12:47:42.706732 master-0 kubenswrapper[13984]: I0312 12:47:42.706635 13984 generic.go:334] "Generic (PLEG): container finished" podID="afbeb3b5-acd9-4095-bc17-29c1b6a29960" containerID="5fe27ddf623c2249d16ceb8d7d93ceda21fba569acfa6a617db5ba317e683218" exitCode=0 Mar 12 12:47:42.706732 master-0 kubenswrapper[13984]: I0312 12:47:42.706717 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-162c-account-create-update-jzxl9" event={"ID":"afbeb3b5-acd9-4095-bc17-29c1b6a29960","Type":"ContainerDied","Data":"5fe27ddf623c2249d16ceb8d7d93ceda21fba569acfa6a617db5ba317e683218"} Mar 12 12:47:42.711620 master-0 kubenswrapper[13984]: I0312 12:47:42.711573 13984 generic.go:334] "Generic (PLEG): container finished" podID="c00d2711-454c-4bc3-9db9-55fa8537ecad" containerID="00a916cf08c787c878b3d1e99a991d2bfef30e1ad62b1dec6ffae0b05853b0c7" exitCode=0 Mar 12 12:47:42.711726 master-0 kubenswrapper[13984]: I0312 12:47:42.711703 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" event={"ID":"c00d2711-454c-4bc3-9db9-55fa8537ecad","Type":"ContainerDied","Data":"00a916cf08c787c878b3d1e99a991d2bfef30e1ad62b1dec6ffae0b05853b0c7"} Mar 12 12:47:43.401409 master-0 kubenswrapper[13984]: I0312 12:47:43.400943 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:43.512345 master-0 kubenswrapper[13984]: I0312 12:47:43.509737 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d18698-6635-4aba-a43e-eaf81cf72796-operator-scripts\") pod \"89d18698-6635-4aba-a43e-eaf81cf72796\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " Mar 12 12:47:43.512345 master-0 kubenswrapper[13984]: I0312 12:47:43.510079 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4wxz\" (UniqueName: \"kubernetes.io/projected/89d18698-6635-4aba-a43e-eaf81cf72796-kube-api-access-b4wxz\") pod \"89d18698-6635-4aba-a43e-eaf81cf72796\" (UID: \"89d18698-6635-4aba-a43e-eaf81cf72796\") " Mar 12 12:47:43.514420 master-0 kubenswrapper[13984]: I0312 12:47:43.514350 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89d18698-6635-4aba-a43e-eaf81cf72796-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "89d18698-6635-4aba-a43e-eaf81cf72796" (UID: "89d18698-6635-4aba-a43e-eaf81cf72796"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:43.518172 master-0 kubenswrapper[13984]: I0312 12:47:43.518139 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89d18698-6635-4aba-a43e-eaf81cf72796-kube-api-access-b4wxz" (OuterVolumeSpecName: "kube-api-access-b4wxz") pod "89d18698-6635-4aba-a43e-eaf81cf72796" (UID: "89d18698-6635-4aba-a43e-eaf81cf72796"). InnerVolumeSpecName "kube-api-access-b4wxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:43.614393 master-0 kubenswrapper[13984]: I0312 12:47:43.614322 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/89d18698-6635-4aba-a43e-eaf81cf72796-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:43.614393 master-0 kubenswrapper[13984]: I0312 12:47:43.614395 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4wxz\" (UniqueName: \"kubernetes.io/projected/89d18698-6635-4aba-a43e-eaf81cf72796-kube-api-access-b4wxz\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:43.677048 master-0 kubenswrapper[13984]: I0312 12:47:43.677001 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:43.703267 master-0 kubenswrapper[13984]: I0312 12:47:43.703231 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:43.709133 master-0 kubenswrapper[13984]: I0312 12:47:43.708759 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:43.727162 master-0 kubenswrapper[13984]: I0312 12:47:43.725876 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-qfsbw" event={"ID":"89d18698-6635-4aba-a43e-eaf81cf72796","Type":"ContainerDied","Data":"b84c9f5a8a56853985ba404940b30fb1e51e609957a485e3cc082e0f5b28d6ef"} Mar 12 12:47:43.727162 master-0 kubenswrapper[13984]: I0312 12:47:43.725947 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b84c9f5a8a56853985ba404940b30fb1e51e609957a485e3cc082e0f5b28d6ef" Mar 12 12:47:43.727162 master-0 kubenswrapper[13984]: I0312 12:47:43.726044 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-qfsbw" Mar 12 12:47:43.730394 master-0 kubenswrapper[13984]: I0312 12:47:43.730353 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-tkj85" Mar 12 12:47:43.730506 master-0 kubenswrapper[13984]: I0312 12:47:43.730350 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-tkj85" event={"ID":"413ffd1a-0d7d-4ee6-8868-1828490695d6","Type":"ContainerDied","Data":"334270c9211936b3e23d5aab01423b30798cda9d23bf92f2d5058e5812c72f6b"} Mar 12 12:47:43.730561 master-0 kubenswrapper[13984]: I0312 12:47:43.730522 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334270c9211936b3e23d5aab01423b30798cda9d23bf92f2d5058e5812c72f6b" Mar 12 12:47:43.735716 master-0 kubenswrapper[13984]: I0312 12:47:43.735692 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" event={"ID":"7180a2fd-8928-4b52-a707-a08eb7230a3b","Type":"ContainerDied","Data":"507be2677d5f5acda51f8cfefe5ddfcb213098eec13148be50cc28728f262c3e"} Mar 12 12:47:43.735827 master-0 kubenswrapper[13984]: I0312 12:47:43.735805 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="507be2677d5f5acda51f8cfefe5ddfcb213098eec13148be50cc28728f262c3e" Mar 12 12:47:43.735908 master-0 kubenswrapper[13984]: I0312 12:47:43.735851 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-0b4a-account-create-update-4zbjs" Mar 12 12:47:43.739683 master-0 kubenswrapper[13984]: I0312 12:47:43.739652 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-nchvw" Mar 12 12:47:43.740246 master-0 kubenswrapper[13984]: I0312 12:47:43.739885 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-nchvw" event={"ID":"9cba6f06-5a3b-4ebc-b36b-779fb17b727b","Type":"ContainerDied","Data":"a60e3de2248b91991011ec9866b2b61783013476497f751e848f6a588e165e62"} Mar 12 12:47:43.740246 master-0 kubenswrapper[13984]: I0312 12:47:43.739975 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a60e3de2248b91991011ec9866b2b61783013476497f751e848f6a588e165e62" Mar 12 12:47:43.742699 master-0 kubenswrapper[13984]: I0312 12:47:43.742644 13984 generic.go:334] "Generic (PLEG): container finished" podID="5bd44f3d-5677-4edd-8e29-8f010f3bfed1" containerID="8a4c54e13be7e71a268f2bc35c6a9b7c986f7801c131f0c086f2968d63cb28cb" exitCode=0 Mar 12 12:47:43.742784 master-0 kubenswrapper[13984]: I0312 12:47:43.742737 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6crfp" event={"ID":"5bd44f3d-5677-4edd-8e29-8f010f3bfed1","Type":"ContainerDied","Data":"8a4c54e13be7e71a268f2bc35c6a9b7c986f7801c131f0c086f2968d63cb28cb"} Mar 12 12:47:43.837215 master-0 kubenswrapper[13984]: I0312 12:47:43.837051 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413ffd1a-0d7d-4ee6-8868-1828490695d6-operator-scripts\") pod \"413ffd1a-0d7d-4ee6-8868-1828490695d6\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " Mar 12 12:47:43.837215 master-0 kubenswrapper[13984]: I0312 12:47:43.837164 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7br7d\" (UniqueName: \"kubernetes.io/projected/413ffd1a-0d7d-4ee6-8868-1828490695d6-kube-api-access-7br7d\") pod \"413ffd1a-0d7d-4ee6-8868-1828490695d6\" (UID: \"413ffd1a-0d7d-4ee6-8868-1828490695d6\") " Mar 12 12:47:43.837215 
master-0 kubenswrapper[13984]: I0312 12:47:43.837194 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7180a2fd-8928-4b52-a707-a08eb7230a3b-operator-scripts\") pod \"7180a2fd-8928-4b52-a707-a08eb7230a3b\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " Mar 12 12:47:43.837466 master-0 kubenswrapper[13984]: I0312 12:47:43.837370 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-operator-scripts\") pod \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " Mar 12 12:47:43.837466 master-0 kubenswrapper[13984]: I0312 12:47:43.837401 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rszrw\" (UniqueName: \"kubernetes.io/projected/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-kube-api-access-rszrw\") pod \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\" (UID: \"9cba6f06-5a3b-4ebc-b36b-779fb17b727b\") " Mar 12 12:47:43.837593 master-0 kubenswrapper[13984]: I0312 12:47:43.837518 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw5jk\" (UniqueName: \"kubernetes.io/projected/7180a2fd-8928-4b52-a707-a08eb7230a3b-kube-api-access-kw5jk\") pod \"7180a2fd-8928-4b52-a707-a08eb7230a3b\" (UID: \"7180a2fd-8928-4b52-a707-a08eb7230a3b\") " Mar 12 12:47:43.844589 master-0 kubenswrapper[13984]: I0312 12:47:43.842763 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7180a2fd-8928-4b52-a707-a08eb7230a3b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7180a2fd-8928-4b52-a707-a08eb7230a3b" (UID: "7180a2fd-8928-4b52-a707-a08eb7230a3b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:43.845320 master-0 kubenswrapper[13984]: I0312 12:47:43.844928 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cba6f06-5a3b-4ebc-b36b-779fb17b727b" (UID: "9cba6f06-5a3b-4ebc-b36b-779fb17b727b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:43.846348 master-0 kubenswrapper[13984]: I0312 12:47:43.846318 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/413ffd1a-0d7d-4ee6-8868-1828490695d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "413ffd1a-0d7d-4ee6-8868-1828490695d6" (UID: "413ffd1a-0d7d-4ee6-8868-1828490695d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:47:43.855817 master-0 kubenswrapper[13984]: I0312 12:47:43.855679 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-kube-api-access-rszrw" (OuterVolumeSpecName: "kube-api-access-rszrw") pod "9cba6f06-5a3b-4ebc-b36b-779fb17b727b" (UID: "9cba6f06-5a3b-4ebc-b36b-779fb17b727b"). InnerVolumeSpecName "kube-api-access-rszrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:43.891790 master-0 kubenswrapper[13984]: I0312 12:47:43.891653 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7180a2fd-8928-4b52-a707-a08eb7230a3b-kube-api-access-kw5jk" (OuterVolumeSpecName: "kube-api-access-kw5jk") pod "7180a2fd-8928-4b52-a707-a08eb7230a3b" (UID: "7180a2fd-8928-4b52-a707-a08eb7230a3b"). InnerVolumeSpecName "kube-api-access-kw5jk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:43.891790 master-0 kubenswrapper[13984]: I0312 12:47:43.891725 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/413ffd1a-0d7d-4ee6-8868-1828490695d6-kube-api-access-7br7d" (OuterVolumeSpecName: "kube-api-access-7br7d") pod "413ffd1a-0d7d-4ee6-8868-1828490695d6" (UID: "413ffd1a-0d7d-4ee6-8868-1828490695d6"). InnerVolumeSpecName "kube-api-access-7br7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:43.946890 master-0 kubenswrapper[13984]: I0312 12:47:43.945735 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/413ffd1a-0d7d-4ee6-8868-1828490695d6-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:43.946890 master-0 kubenswrapper[13984]: I0312 12:47:43.945807 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7br7d\" (UniqueName: \"kubernetes.io/projected/413ffd1a-0d7d-4ee6-8868-1828490695d6-kube-api-access-7br7d\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:43.946890 master-0 kubenswrapper[13984]: I0312 12:47:43.945827 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7180a2fd-8928-4b52-a707-a08eb7230a3b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:43.946890 master-0 kubenswrapper[13984]: I0312 12:47:43.945840 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:43.946890 master-0 kubenswrapper[13984]: I0312 12:47:43.945858 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rszrw\" (UniqueName: \"kubernetes.io/projected/9cba6f06-5a3b-4ebc-b36b-779fb17b727b-kube-api-access-rszrw\") on node \"master-0\" 
DevicePath \"\"" Mar 12 12:47:43.946890 master-0 kubenswrapper[13984]: I0312 12:47:43.945877 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw5jk\" (UniqueName: \"kubernetes.io/projected/7180a2fd-8928-4b52-a707-a08eb7230a3b-kube-api-access-kw5jk\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:44.364156 master-0 kubenswrapper[13984]: I0312 12:47:44.364042 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" Mar 12 12:47:44.448219 master-0 kubenswrapper[13984]: I0312 12:47:44.448145 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-162c-account-create-update-jzxl9" Mar 12 12:47:44.515520 master-0 kubenswrapper[13984]: I0312 12:47:44.514670 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00d2711-454c-4bc3-9db9-55fa8537ecad-operator-scripts\") pod \"c00d2711-454c-4bc3-9db9-55fa8537ecad\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " Mar 12 12:47:44.515520 master-0 kubenswrapper[13984]: I0312 12:47:44.514831 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cr76\" (UniqueName: \"kubernetes.io/projected/c00d2711-454c-4bc3-9db9-55fa8537ecad-kube-api-access-7cr76\") pod \"c00d2711-454c-4bc3-9db9-55fa8537ecad\" (UID: \"c00d2711-454c-4bc3-9db9-55fa8537ecad\") " Mar 12 12:47:44.515520 master-0 kubenswrapper[13984]: I0312 12:47:44.514889 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zl7cr\" (UniqueName: \"kubernetes.io/projected/afbeb3b5-acd9-4095-bc17-29c1b6a29960-kube-api-access-zl7cr\") pod \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") " Mar 12 12:47:44.515520 master-0 kubenswrapper[13984]: I0312 12:47:44.515231 13984 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbeb3b5-acd9-4095-bc17-29c1b6a29960-operator-scripts\") pod \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\" (UID: \"afbeb3b5-acd9-4095-bc17-29c1b6a29960\") "
Mar 12 12:47:44.523386 master-0 kubenswrapper[13984]: I0312 12:47:44.519300 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c00d2711-454c-4bc3-9db9-55fa8537ecad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c00d2711-454c-4bc3-9db9-55fa8537ecad" (UID: "c00d2711-454c-4bc3-9db9-55fa8537ecad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:47:44.523386 master-0 kubenswrapper[13984]: I0312 12:47:44.520008 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afbeb3b5-acd9-4095-bc17-29c1b6a29960-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "afbeb3b5-acd9-4095-bc17-29c1b6a29960" (UID: "afbeb3b5-acd9-4095-bc17-29c1b6a29960"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 12 12:47:44.523386 master-0 kubenswrapper[13984]: I0312 12:47:44.521137 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c00d2711-454c-4bc3-9db9-55fa8537ecad-kube-api-access-7cr76" (OuterVolumeSpecName: "kube-api-access-7cr76") pod "c00d2711-454c-4bc3-9db9-55fa8537ecad" (UID: "c00d2711-454c-4bc3-9db9-55fa8537ecad"). InnerVolumeSpecName "kube-api-access-7cr76". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:47:44.527508 master-0 kubenswrapper[13984]: I0312 12:47:44.524706 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afbeb3b5-acd9-4095-bc17-29c1b6a29960-kube-api-access-zl7cr" (OuterVolumeSpecName: "kube-api-access-zl7cr") pod "afbeb3b5-acd9-4095-bc17-29c1b6a29960" (UID: "afbeb3b5-acd9-4095-bc17-29c1b6a29960"). InnerVolumeSpecName "kube-api-access-zl7cr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:47:44.627036 master-0 kubenswrapper[13984]: I0312 12:47:44.625660 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c00d2711-454c-4bc3-9db9-55fa8537ecad-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:44.627036 master-0 kubenswrapper[13984]: I0312 12:47:44.625775 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cr76\" (UniqueName: \"kubernetes.io/projected/c00d2711-454c-4bc3-9db9-55fa8537ecad-kube-api-access-7cr76\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:44.627036 master-0 kubenswrapper[13984]: I0312 12:47:44.625794 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zl7cr\" (UniqueName: \"kubernetes.io/projected/afbeb3b5-acd9-4095-bc17-29c1b6a29960-kube-api-access-zl7cr\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:44.627036 master-0 kubenswrapper[13984]: I0312 12:47:44.625808 13984 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/afbeb3b5-acd9-4095-bc17-29c1b6a29960-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:44.755085 master-0 kubenswrapper[13984]: I0312 12:47:44.755027 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-162c-account-create-update-jzxl9" event={"ID":"afbeb3b5-acd9-4095-bc17-29c1b6a29960","Type":"ContainerDied","Data":"74a30656558ef62d2394ba8096515199afd42f6c6427c366902c27425f2add3c"}
Mar 12 12:47:44.755085 master-0 kubenswrapper[13984]: I0312 12:47:44.755084 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74a30656558ef62d2394ba8096515199afd42f6c6427c366902c27425f2add3c"
Mar 12 12:47:44.755360 master-0 kubenswrapper[13984]: I0312 12:47:44.755088 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-162c-account-create-update-jzxl9"
Mar 12 12:47:44.758336 master-0 kubenswrapper[13984]: I0312 12:47:44.758283 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-642a-account-create-update-lwl6p" event={"ID":"c00d2711-454c-4bc3-9db9-55fa8537ecad","Type":"ContainerDied","Data":"6f0e8eb61b39f4596af6ad67a8272cd5c5c8badebaec177c8a6e70f75be38f89"}
Mar 12 12:47:44.758336 master-0 kubenswrapper[13984]: I0312 12:47:44.758343 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6f0e8eb61b39f4596af6ad67a8272cd5c5c8badebaec177c8a6e70f75be38f89"
Mar 12 12:47:44.758594 master-0 kubenswrapper[13984]: I0312 12:47:44.758313 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-642a-account-create-update-lwl6p"
Mar 12 12:47:45.153410 master-0 kubenswrapper[13984]: I0312 12:47:45.152084 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-6crfp"
Mar 12 12:47:45.239524 master-0 kubenswrapper[13984]: I0312 12:47:45.239430 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") "
Mar 12 12:47:45.239794 master-0 kubenswrapper[13984]: I0312 12:47:45.239564 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic\") pod \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") "
Mar 12 12:47:45.239794 master-0 kubenswrapper[13984]: I0312 12:47:45.239622 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-etc-podinfo\") pod \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") "
Mar 12 12:47:45.239794 master-0 kubenswrapper[13984]: I0312 12:47:45.239690 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-combined-ca-bundle\") pod \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") "
Mar 12 12:47:45.240071 master-0 kubenswrapper[13984]: I0312 12:47:45.239935 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-scripts\") pod \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") "
Mar 12 12:47:45.240071 master-0 kubenswrapper[13984]: I0312 12:47:45.239966 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kq7f\" (UniqueName: \"kubernetes.io/projected/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-kube-api-access-6kq7f\") pod \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") "
Mar 12 12:47:45.240212 master-0 kubenswrapper[13984]: I0312 12:47:45.240110 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-config\") pod \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\" (UID: \"5bd44f3d-5677-4edd-8e29-8f010f3bfed1\") "
Mar 12 12:47:45.242402 master-0 kubenswrapper[13984]: I0312 12:47:45.241623 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "5bd44f3d-5677-4edd-8e29-8f010f3bfed1" (UID: "5bd44f3d-5677-4edd-8e29-8f010f3bfed1"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:47:45.242532 master-0 kubenswrapper[13984]: I0312 12:47:45.242405 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "5bd44f3d-5677-4edd-8e29-8f010f3bfed1" (UID: "5bd44f3d-5677-4edd-8e29-8f010f3bfed1"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:47:45.249538 master-0 kubenswrapper[13984]: I0312 12:47:45.246653 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-kube-api-access-6kq7f" (OuterVolumeSpecName: "kube-api-access-6kq7f") pod "5bd44f3d-5677-4edd-8e29-8f010f3bfed1" (UID: "5bd44f3d-5677-4edd-8e29-8f010f3bfed1"). InnerVolumeSpecName "kube-api-access-6kq7f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:47:45.249538 master-0 kubenswrapper[13984]: I0312 12:47:45.248118 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-scripts" (OuterVolumeSpecName: "scripts") pod "5bd44f3d-5677-4edd-8e29-8f010f3bfed1" (UID: "5bd44f3d-5677-4edd-8e29-8f010f3bfed1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:45.254381 master-0 kubenswrapper[13984]: I0312 12:47:45.254335 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "5bd44f3d-5677-4edd-8e29-8f010f3bfed1" (UID: "5bd44f3d-5677-4edd-8e29-8f010f3bfed1"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 12 12:47:45.281852 master-0 kubenswrapper[13984]: I0312 12:47:45.281801 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-config" (OuterVolumeSpecName: "config") pod "5bd44f3d-5677-4edd-8e29-8f010f3bfed1" (UID: "5bd44f3d-5677-4edd-8e29-8f010f3bfed1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:45.291743 master-0 kubenswrapper[13984]: I0312 12:47:45.291684 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bd44f3d-5677-4edd-8e29-8f010f3bfed1" (UID: "5bd44f3d-5677-4edd-8e29-8f010f3bfed1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:47:45.343740 master-0 kubenswrapper[13984]: I0312 12:47:45.343675 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-config\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:45.343740 master-0 kubenswrapper[13984]: I0312 12:47:45.343731 13984 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:45.344052 master-0 kubenswrapper[13984]: I0312 12:47:45.343757 13984 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-var-lib-ironic\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:45.344052 master-0 kubenswrapper[13984]: I0312 12:47:45.343772 13984 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-etc-podinfo\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:45.344052 master-0 kubenswrapper[13984]: I0312 12:47:45.343787 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:45.344052 master-0 kubenswrapper[13984]: I0312 12:47:45.343801 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:45.344052 master-0 kubenswrapper[13984]: I0312 12:47:45.343882 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kq7f\" (UniqueName: \"kubernetes.io/projected/5bd44f3d-5677-4edd-8e29-8f010f3bfed1-kube-api-access-6kq7f\") on node \"master-0\" DevicePath \"\""
Mar 12 12:47:45.769216 master-0 kubenswrapper[13984]: I0312 12:47:45.769139 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-6crfp" event={"ID":"5bd44f3d-5677-4edd-8e29-8f010f3bfed1","Type":"ContainerDied","Data":"9a442509f9d0154f81c97d28bae98238712305d8bce7fc44cd03be5dcf902625"}
Mar 12 12:47:45.769216 master-0 kubenswrapper[13984]: I0312 12:47:45.769203 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a442509f9d0154f81c97d28bae98238712305d8bce7fc44cd03be5dcf902625"
Mar 12 12:47:45.769755 master-0 kubenswrapper[13984]: I0312 12:47:45.769312 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-6crfp"
Mar 12 12:47:46.981263 master-0 kubenswrapper[13984]: I0312 12:47:46.981206 13984 scope.go:117] "RemoveContainer" containerID="f4eef6d8ff3089b6ff9de408999890bf729aa01c13390b10649e537236614ca8"
Mar 12 12:47:47.713774 master-0 kubenswrapper[13984]: I0312 12:47:47.713597 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"]
Mar 12 12:47:47.721944 master-0 kubenswrapper[13984]: I0312 12:47:47.719951 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-external-api-0" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-log" containerID="cri-o://c1921e384336ad0fddcfac6b8330b593a2d4a272e635610aa7d493e8bde23d84" gracePeriod=30
Mar 12 12:47:47.721944 master-0 kubenswrapper[13984]: I0312 12:47:47.720432 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-external-api-0" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-httpd" containerID="cri-o://54414d412150bc428c6f0d18b48a1bd10be8e921597c44cbc9ca14d21b796c12" gracePeriod=30
Mar 12 12:47:47.775540 master-0 kubenswrapper[13984]: I0312 12:47:47.774977 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7897fc7475-gbjfd"]
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789111 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="413ffd1a-0d7d-4ee6-8868-1828490695d6" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789239 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="413ffd1a-0d7d-4ee6-8868-1828490695d6" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789274 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afbeb3b5-acd9-4095-bc17-29c1b6a29960" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789283 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="afbeb3b5-acd9-4095-bc17-29c1b6a29960" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789320 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89d18698-6635-4aba-a43e-eaf81cf72796" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789329 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="89d18698-6635-4aba-a43e-eaf81cf72796" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789347 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789356 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789394 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd44f3d-5677-4edd-8e29-8f010f3bfed1" containerName="ironic-inspector-db-sync"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789405 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd44f3d-5677-4edd-8e29-8f010f3bfed1" containerName="ironic-inspector-db-sync"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789437 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789444 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789486 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="init"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789494 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="init"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789507 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c00d2711-454c-4bc3-9db9-55fa8537ecad" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789515 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="c00d2711-454c-4bc3-9db9-55fa8537ecad" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789561 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789569 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789577 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cba6f06-5a3b-4ebc-b36b-779fb17b727b" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789583 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cba6f06-5a3b-4ebc-b36b-779fb17b727b" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789608 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-httpd"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789615 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-httpd"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789645 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api-log"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789652 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api-log"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: E0312 12:47:47.789665 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7180a2fd-8928-4b52-a707-a08eb7230a3b" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.789671 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="7180a2fd-8928-4b52-a707-a08eb7230a3b" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790068 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790105 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="89d18698-6635-4aba-a43e-eaf81cf72796" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790127 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790140 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="c00d2711-454c-4bc3-9db9-55fa8537ecad" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790152 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-httpd"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790179 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="64252643-1a6d-4283-a6be-bdb4adbac235" containerName="ironic-api-log"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790201 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cba6f06-5a3b-4ebc-b36b-779fb17b727b" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790214 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="7180a2fd-8928-4b52-a707-a08eb7230a3b" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790227 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="413ffd1a-0d7d-4ee6-8868-1828490695d6" containerName="mariadb-database-create"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790236 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="afbeb3b5-acd9-4095-bc17-29c1b6a29960" containerName="mariadb-account-create-update"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790273 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="91599efa-a74b-48bc-8553-58d2c33e1e75" containerName="neutron-api"
Mar 12 12:47:47.791228 master-0 kubenswrapper[13984]: I0312 12:47:47.790295 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd44f3d-5677-4edd-8e29-8f010f3bfed1" containerName="ironic-inspector-db-sync"
Mar 12 12:47:47.793357 master-0 kubenswrapper[13984]: I0312 12:47:47.792063 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:47.841659 master-0 kubenswrapper[13984]: I0312 12:47:47.838534 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7897fc7475-gbjfd"]
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.930008 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6dk2\" (UniqueName: \"kubernetes.io/projected/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-kube-api-access-w6dk2\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.930141 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-swift-storage-0\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.930470 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-svc\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.930544 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-sb\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.930591 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-config\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.930674 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-nb\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.938578 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cd96k"]
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.940929 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.944148 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 12 12:47:47.944604 master-0 kubenswrapper[13984]: I0312 12:47:47.944414 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 12 12:47:47.953606 master-0 kubenswrapper[13984]: I0312 12:47:47.950278 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cd96k"]
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032245 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-swift-storage-0\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032332 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-scripts\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032396 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-svc\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032445 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-sb\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032492 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-config\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032549 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032578 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-nb\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032595 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-config-data\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032649 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rksns\" (UniqueName: \"kubernetes.io/projected/fc8b81ae-1976-4abe-8d82-4decd232dd98-kube-api-access-rksns\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.033573 master-0 kubenswrapper[13984]: I0312 12:47:48.032742 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6dk2\" (UniqueName: \"kubernetes.io/projected/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-kube-api-access-w6dk2\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.035420 master-0 kubenswrapper[13984]: I0312 12:47:48.034263 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-config\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.035420 master-0 kubenswrapper[13984]: I0312 12:47:48.034865 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-svc\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.035538 master-0 kubenswrapper[13984]: I0312 12:47:48.035500 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-sb\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.037182 master-0 kubenswrapper[13984]: I0312 12:47:48.036021 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-nb\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.044029 master-0 kubenswrapper[13984]: I0312 12:47:48.042216 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-swift-storage-0\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.079522 master-0 kubenswrapper[13984]: I0312 12:47:48.078982 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"]
Mar 12 12:47:48.107983 master-0 kubenswrapper[13984]: I0312 12:47:48.106617 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.107983 master-0 kubenswrapper[13984]: I0312 12:47:48.098893 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6dk2\" (UniqueName: \"kubernetes.io/projected/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-kube-api-access-w6dk2\") pod \"dnsmasq-dns-7897fc7475-gbjfd\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " pod="openstack/dnsmasq-dns-7897fc7475-gbjfd"
Mar 12 12:47:48.119502 master-0 kubenswrapper[13984]: I0312 12:47:48.113973 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport"
Mar 12 12:47:48.119502 master-0 kubenswrapper[13984]: I0312 12:47:48.114462 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data"
Mar 12 12:47:48.119502 master-0 kubenswrapper[13984]: I0312 12:47:48.114683 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142068 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142198 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/55ffff98-83d0-4d6c-829f-a35636db60fb-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142237 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-scripts\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142285 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142355 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142380 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-config-data\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142413 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfplw\" (UniqueName: \"kubernetes.io/projected/55ffff98-83d0-4d6c-829f-a35636db60fb-kube-api-access-mfplw\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142437 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-scripts\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142457 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rksns\" (UniqueName: \"kubernetes.io/projected/fc8b81ae-1976-4abe-8d82-4decd232dd98-kube-api-access-rksns\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142520 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-config\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.143628 master-0 kubenswrapper[13984]: I0312 12:47:48.142562 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0"
Mar 12 12:47:48.147797 master-0 kubenswrapper[13984]: I0312 12:47:48.147745 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-config-data\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:47:48.152149 master-0 kubenswrapper[13984]:
I0312 12:47:48.151207 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k" Mar 12 12:47:48.152334 master-0 kubenswrapper[13984]: I0312 12:47:48.152179 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-scripts\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k" Mar 12 12:47:48.156520 master-0 kubenswrapper[13984]: I0312 12:47:48.152583 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 12:47:48.183520 master-0 kubenswrapper[13984]: I0312 12:47:48.180541 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rksns\" (UniqueName: \"kubernetes.io/projected/fc8b81ae-1976-4abe-8d82-4decd232dd98-kube-api-access-rksns\") pod \"nova-cell0-conductor-db-sync-cd96k\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") " pod="openstack/nova-cell0-conductor-db-sync-cd96k" Mar 12 12:47:48.187516 master-0 kubenswrapper[13984]: I0312 12:47:48.186187 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" Mar 12 12:47:48.251085 master-0 kubenswrapper[13984]: I0312 12:47:48.250930 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfplw\" (UniqueName: \"kubernetes.io/projected/55ffff98-83d0-4d6c-829f-a35636db60fb-kube-api-access-mfplw\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.251085 master-0 kubenswrapper[13984]: I0312 12:47:48.251011 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-scripts\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.251085 master-0 kubenswrapper[13984]: I0312 12:47:48.251078 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-config\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.251423 master-0 kubenswrapper[13984]: I0312 12:47:48.251143 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.251423 master-0 kubenswrapper[13984]: I0312 12:47:48.251201 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.251423 master-0 
kubenswrapper[13984]: I0312 12:47:48.251288 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/55ffff98-83d0-4d6c-829f-a35636db60fb-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.251423 master-0 kubenswrapper[13984]: I0312 12:47:48.251387 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.254277 master-0 kubenswrapper[13984]: I0312 12:47:48.252287 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.254277 master-0 kubenswrapper[13984]: I0312 12:47:48.252509 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.257505 master-0 kubenswrapper[13984]: I0312 12:47:48.256204 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-scripts\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.260356 master-0 kubenswrapper[13984]: I0312 
12:47:48.258059 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/55ffff98-83d0-4d6c-829f-a35636db60fb-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.260356 master-0 kubenswrapper[13984]: I0312 12:47:48.258988 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.260356 master-0 kubenswrapper[13984]: I0312 12:47:48.259091 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-config\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.279202 master-0 kubenswrapper[13984]: I0312 12:47:48.278987 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfplw\" (UniqueName: \"kubernetes.io/projected/55ffff98-83d0-4d6c-829f-a35636db60fb-kube-api-access-mfplw\") pod \"ironic-inspector-0\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:48.316018 master-0 kubenswrapper[13984]: I0312 12:47:48.315960 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cd96k" Mar 12 12:47:48.379690 master-0 kubenswrapper[13984]: I0312 12:47:48.379624 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 12 12:47:49.230243 master-0 kubenswrapper[13984]: I0312 12:47:49.229828 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:49.234154 master-0 kubenswrapper[13984]: I0312 12:47:49.234113 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6bdc8876c8-5m8cl" Mar 12 12:47:49.482717 master-0 kubenswrapper[13984]: I0312 12:47:49.465925 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5cbb6f6f66-dz95j"] Mar 12 12:47:49.482717 master-0 kubenswrapper[13984]: I0312 12:47:49.466349 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5cbb6f6f66-dz95j" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-log" containerID="cri-o://20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f" gracePeriod=30 Mar 12 12:47:49.482717 master-0 kubenswrapper[13984]: I0312 12:47:49.472082 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5cbb6f6f66-dz95j" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-api" containerID="cri-o://4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae" gracePeriod=30 Mar 12 12:47:50.117426 master-0 kubenswrapper[13984]: I0312 12:47:50.117367 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:47:50.118125 master-0 kubenswrapper[13984]: I0312 12:47:50.118091 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-internal-api-0" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-log" containerID="cri-o://5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e" gracePeriod=30 Mar 12 12:47:50.118563 master-0 kubenswrapper[13984]: I0312 
12:47:50.118524 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-f98a5-default-internal-api-0" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-httpd" containerID="cri-o://23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532" gracePeriod=30 Mar 12 12:47:50.810339 master-0 kubenswrapper[13984]: I0312 12:47:50.810269 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 12:47:51.383879 master-0 kubenswrapper[13984]: E0312 12:47:51.383526 13984 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod920ddbc6_460a_455d_b6eb_8a9393dc169e.slice/crio-conmon-54414d412150bc428c6f0d18b48a1bd10be8e921597c44cbc9ca14d21b796c12.scope\": RecentStats: unable to find data in memory cache]" Mar 12 12:47:51.948877 master-0 kubenswrapper[13984]: I0312 12:47:51.947418 13984 generic.go:334] "Generic (PLEG): container finished" podID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerID="54414d412150bc428c6f0d18b48a1bd10be8e921597c44cbc9ca14d21b796c12" exitCode=0 Mar 12 12:47:51.948877 master-0 kubenswrapper[13984]: I0312 12:47:51.947453 13984 generic.go:334] "Generic (PLEG): container finished" podID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerID="c1921e384336ad0fddcfac6b8330b593a2d4a272e635610aa7d493e8bde23d84" exitCode=143 Mar 12 12:47:51.948877 master-0 kubenswrapper[13984]: I0312 12:47:51.947549 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"920ddbc6-460a-455d-b6eb-8a9393dc169e","Type":"ContainerDied","Data":"54414d412150bc428c6f0d18b48a1bd10be8e921597c44cbc9ca14d21b796c12"} Mar 12 12:47:51.948877 master-0 kubenswrapper[13984]: I0312 12:47:51.947577 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" 
event={"ID":"920ddbc6-460a-455d-b6eb-8a9393dc169e","Type":"ContainerDied","Data":"c1921e384336ad0fddcfac6b8330b593a2d4a272e635610aa7d493e8bde23d84"} Mar 12 12:47:51.950583 master-0 kubenswrapper[13984]: I0312 12:47:51.949964 13984 generic.go:334] "Generic (PLEG): container finished" podID="efa87d2e-769b-4517-a700-73d24e233846" containerID="20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f" exitCode=143 Mar 12 12:47:51.950583 master-0 kubenswrapper[13984]: I0312 12:47:51.950008 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb6f6f66-dz95j" event={"ID":"efa87d2e-769b-4517-a700-73d24e233846","Type":"ContainerDied","Data":"20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f"} Mar 12 12:47:51.954335 master-0 kubenswrapper[13984]: I0312 12:47:51.954300 13984 generic.go:334] "Generic (PLEG): container finished" podID="4b525941-42e8-455b-b117-fae2d47b7977" containerID="5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e" exitCode=143 Mar 12 12:47:51.954414 master-0 kubenswrapper[13984]: I0312 12:47:51.954336 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"4b525941-42e8-455b-b117-fae2d47b7977","Type":"ContainerDied","Data":"5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e"} Mar 12 12:47:52.382687 master-0 kubenswrapper[13984]: I0312 12:47:52.382643 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.407886 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-scripts\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.407982 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-httpd-run\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.408017 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-logs\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.408828 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.408925 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4t2g\" (UniqueName: \"kubernetes.io/projected/920ddbc6-460a-455d-b6eb-8a9393dc169e-kube-api-access-k4t2g\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.409080 13984 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-combined-ca-bundle\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.409111 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-public-tls-certs\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.410247 master-0 kubenswrapper[13984]: I0312 12:47:52.409341 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-config-data\") pod \"920ddbc6-460a-455d-b6eb-8a9393dc169e\" (UID: \"920ddbc6-460a-455d-b6eb-8a9393dc169e\") " Mar 12 12:47:52.421742 master-0 kubenswrapper[13984]: I0312 12:47:52.417018 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:52.421742 master-0 kubenswrapper[13984]: I0312 12:47:52.417404 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-logs" (OuterVolumeSpecName: "logs") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:52.456505 master-0 kubenswrapper[13984]: I0312 12:47:52.446325 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-scripts" (OuterVolumeSpecName: "scripts") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:52.459939 master-0 kubenswrapper[13984]: I0312 12:47:52.458754 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/920ddbc6-460a-455d-b6eb-8a9393dc169e-kube-api-access-k4t2g" (OuterVolumeSpecName: "kube-api-access-k4t2g") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "kube-api-access-k4t2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:52.495024 master-0 kubenswrapper[13984]: I0312 12:47:52.487613 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e" (OuterVolumeSpecName: "glance") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 12:47:52.516655 master-0 kubenswrapper[13984]: I0312 12:47:52.516597 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:52.516956 master-0 kubenswrapper[13984]: I0312 12:47:52.516733 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-config-data" (OuterVolumeSpecName: "config-data") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:52.518768 master-0 kubenswrapper[13984]: I0312 12:47:52.518721 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.518859 master-0 kubenswrapper[13984]: I0312 12:47:52.518772 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.518859 master-0 kubenswrapper[13984]: I0312 12:47:52.518784 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.518859 master-0 kubenswrapper[13984]: I0312 12:47:52.518794 13984 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.518859 master-0 kubenswrapper[13984]: I0312 12:47:52.518803 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/920ddbc6-460a-455d-b6eb-8a9393dc169e-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.518859 master-0 kubenswrapper[13984]: I0312 12:47:52.518855 13984 reconciler_common.go:286] 
"operationExecutor.UnmountDevice started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") on node \"master-0\" " Mar 12 12:47:52.519120 master-0 kubenswrapper[13984]: I0312 12:47:52.518871 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4t2g\" (UniqueName: \"kubernetes.io/projected/920ddbc6-460a-455d-b6eb-8a9393dc169e-kube-api-access-k4t2g\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.575960 master-0 kubenswrapper[13984]: I0312 12:47:52.575543 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "920ddbc6-460a-455d-b6eb-8a9393dc169e" (UID: "920ddbc6-460a-455d-b6eb-8a9393dc169e"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:52.581074 master-0 kubenswrapper[13984]: I0312 12:47:52.581009 13984 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 12 12:47:52.581900 master-0 kubenswrapper[13984]: I0312 12:47:52.581869 13984 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383" (UniqueName: "kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e") on node "master-0" Mar 12 12:47:52.621022 master-0 kubenswrapper[13984]: I0312 12:47:52.620961 13984 reconciler_common.go:293] "Volume detached for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.621337 master-0 kubenswrapper[13984]: I0312 12:47:52.621325 13984 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/920ddbc6-460a-455d-b6eb-8a9393dc169e-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:52.715656 master-0 kubenswrapper[13984]: I0312 12:47:52.714715 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7897fc7475-gbjfd"] Mar 12 12:47:52.817047 master-0 kubenswrapper[13984]: I0312 12:47:52.808823 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cd96k"] Mar 12 12:47:52.818657 master-0 kubenswrapper[13984]: W0312 12:47:52.818618 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc8b81ae_1976_4abe_8d82_4decd232dd98.slice/crio-d3762c4c159ca286391eafba0e257db35b1805c173d75ad454a2525795db954f WatchSource:0}: Error finding container d3762c4c159ca286391eafba0e257db35b1805c173d75ad454a2525795db954f: Status 404 returned error can't find the container with id d3762c4c159ca286391eafba0e257db35b1805c173d75ad454a2525795db954f Mar 12 12:47:52.840570 master-0 kubenswrapper[13984]: W0312 12:47:52.840525 13984 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55ffff98_83d0_4d6c_829f_a35636db60fb.slice/crio-c88abf30d2082a13f384d874bec58d8bfa9aac20a4f6732122f6a07f4ee4fbc9 WatchSource:0}: Error finding container c88abf30d2082a13f384d874bec58d8bfa9aac20a4f6732122f6a07f4ee4fbc9: Status 404 returned error can't find the container with id c88abf30d2082a13f384d874bec58d8bfa9aac20a4f6732122f6a07f4ee4fbc9 Mar 12 12:47:52.843060 master-0 kubenswrapper[13984]: I0312 12:47:52.843013 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 12:47:52.968005 master-0 kubenswrapper[13984]: I0312 12:47:52.967895 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"920ddbc6-460a-455d-b6eb-8a9393dc169e","Type":"ContainerDied","Data":"b67f129a214b57aa3a755f00afafecf41db3a50dc3fd55d84439913149e59cef"} Mar 12 12:47:52.968005 master-0 kubenswrapper[13984]: I0312 12:47:52.967969 13984 scope.go:117] "RemoveContainer" containerID="54414d412150bc428c6f0d18b48a1bd10be8e921597c44cbc9ca14d21b796c12" Mar 12 12:47:52.968492 master-0 kubenswrapper[13984]: I0312 12:47:52.968177 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:52.975003 master-0 kubenswrapper[13984]: I0312 12:47:52.974356 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"55ffff98-83d0-4d6c-829f-a35636db60fb","Type":"ContainerStarted","Data":"c88abf30d2082a13f384d874bec58d8bfa9aac20a4f6732122f6a07f4ee4fbc9"} Mar 12 12:47:52.977272 master-0 kubenswrapper[13984]: I0312 12:47:52.977235 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerStarted","Data":"aadfeb7c3c272ecb3417d38a31a0ae0c25d1b20ce600b2ad9ef70bf1e3df0fac"} Mar 12 12:47:52.982856 master-0 kubenswrapper[13984]: I0312 12:47:52.982800 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cd96k" event={"ID":"fc8b81ae-1976-4abe-8d82-4decd232dd98","Type":"ContainerStarted","Data":"d3762c4c159ca286391eafba0e257db35b1805c173d75ad454a2525795db954f"} Mar 12 12:47:52.985708 master-0 kubenswrapper[13984]: I0312 12:47:52.985672 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" event={"ID":"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545","Type":"ContainerStarted","Data":"f5ea50cb525fd351e84b1ea49512fe7712dda7f65b55251ad63872445842102b"} Mar 12 12:47:52.991860 master-0 kubenswrapper[13984]: I0312 12:47:52.991820 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-776949857-qhnzl" event={"ID":"4cecbed0-f638-495f-a450-ddcb64f6cc30","Type":"ContainerStarted","Data":"72d48f6e42f43ce263071185d3a1b1b5cc721c0fe540c7493a7f73cd3c2eb9da"} Mar 12 12:47:52.992941 master-0 kubenswrapper[13984]: I0312 12:47:52.992911 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:53.020006 master-0 kubenswrapper[13984]: I0312 12:47:53.019929 13984 
scope.go:117] "RemoveContainer" containerID="c1921e384336ad0fddcfac6b8330b593a2d4a272e635610aa7d493e8bde23d84" Mar 12 12:47:53.073888 master-0 kubenswrapper[13984]: I0312 12:47:53.073738 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:47:53.088133 master-0 kubenswrapper[13984]: I0312 12:47:53.088040 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: I0312 12:47:53.268868 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: E0312 12:47:53.269350 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-httpd" Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: I0312 12:47:53.269363 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-httpd" Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: E0312 12:47:53.269418 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-log" Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: I0312 12:47:53.269426 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-log" Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: I0312 12:47:53.269652 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-log" Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: I0312 12:47:53.269668 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" containerName="glance-httpd" Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: I0312 12:47:53.270812 
13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.274601 master-0 kubenswrapper[13984]: I0312 12:47:53.273986 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 12 12:47:53.275290 master-0 kubenswrapper[13984]: I0312 12:47:53.275212 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-f98a5-default-external-config-data" Mar 12 12:47:53.291631 master-0 kubenswrapper[13984]: I0312 12:47:53.290241 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:47:53.462082 master-0 kubenswrapper[13984]: I0312 12:47:53.462011 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.462440 master-0 kubenswrapper[13984]: I0312 12:47:53.462421 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.462607 master-0 kubenswrapper[13984]: I0312 12:47:53.462590 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " 
pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.462755 master-0 kubenswrapper[13984]: I0312 12:47:53.462738 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.462860 master-0 kubenswrapper[13984]: I0312 12:47:53.462847 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ds8b\" (UniqueName: \"kubernetes.io/projected/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-kube-api-access-2ds8b\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.463026 master-0 kubenswrapper[13984]: I0312 12:47:53.463012 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.463115 master-0 kubenswrapper[13984]: I0312 12:47:53.463102 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-public-tls-certs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.463264 master-0 kubenswrapper[13984]: I0312 12:47:53.463251 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.529700 master-0 kubenswrapper[13984]: I0312 12:47:53.529647 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5cbb6f6f66-dz95j" Mar 12 12:47:53.574354 master-0 kubenswrapper[13984]: I0312 12:47:53.574300 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.574828 master-0 kubenswrapper[13984]: I0312 12:47:53.574807 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.574997 master-0 kubenswrapper[13984]: I0312 12:47:53.574983 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.575144 master-0 kubenswrapper[13984]: I0312 12:47:53.575113 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: 
\"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.575251 master-0 kubenswrapper[13984]: I0312 12:47:53.575238 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ds8b\" (UniqueName: \"kubernetes.io/projected/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-kube-api-access-2ds8b\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.575527 master-0 kubenswrapper[13984]: I0312 12:47:53.575508 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.575635 master-0 kubenswrapper[13984]: I0312 12:47:53.575623 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-public-tls-certs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.575829 master-0 kubenswrapper[13984]: I0312 12:47:53.575816 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.577850 master-0 kubenswrapper[13984]: I0312 12:47:53.577666 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-logs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.578124 master-0 kubenswrapper[13984]: I0312 12:47:53.577930 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-httpd-run\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.583166 master-0 kubenswrapper[13984]: I0312 12:47:53.583128 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-public-tls-certs\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.586505 master-0 kubenswrapper[13984]: I0312 12:47:53.584353 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-scripts\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.587357 master-0 kubenswrapper[13984]: I0312 12:47:53.587010 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 12 12:47:53.587357 master-0 kubenswrapper[13984]: I0312 12:47:53.587037 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/705cac2184d649d44bc54d9fe6c523322613cb4b44faad7af7e5abd2b4c3196c/globalmount\"" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.594520 master-0 kubenswrapper[13984]: I0312 12:47:53.594307 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-combined-ca-bundle\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.595839 master-0 kubenswrapper[13984]: I0312 12:47:53.595817 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-config-data\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.612367 master-0 kubenswrapper[13984]: I0312 12:47:53.612284 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ds8b\" (UniqueName: \"kubernetes.io/projected/dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910-kube-api-access-2ds8b\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.680114 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-config-data\") pod \"efa87d2e-769b-4517-a700-73d24e233846\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.680204 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnqqt\" (UniqueName: \"kubernetes.io/projected/efa87d2e-769b-4517-a700-73d24e233846-kube-api-access-xnqqt\") pod \"efa87d2e-769b-4517-a700-73d24e233846\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.680417 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-scripts\") pod \"efa87d2e-769b-4517-a700-73d24e233846\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.680446 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-internal-tls-certs\") pod \"efa87d2e-769b-4517-a700-73d24e233846\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.680470 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-public-tls-certs\") pod \"efa87d2e-769b-4517-a700-73d24e233846\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.680509 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-combined-ca-bundle\") pod \"efa87d2e-769b-4517-a700-73d24e233846\" (UID: 
\"efa87d2e-769b-4517-a700-73d24e233846\") " Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.680536 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa87d2e-769b-4517-a700-73d24e233846-logs\") pod \"efa87d2e-769b-4517-a700-73d24e233846\" (UID: \"efa87d2e-769b-4517-a700-73d24e233846\") " Mar 12 12:47:53.687501 master-0 kubenswrapper[13984]: I0312 12:47:53.681300 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efa87d2e-769b-4517-a700-73d24e233846-logs" (OuterVolumeSpecName: "logs") pod "efa87d2e-769b-4517-a700-73d24e233846" (UID: "efa87d2e-769b-4517-a700-73d24e233846"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:53.702635 master-0 kubenswrapper[13984]: I0312 12:47:53.690508 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-scripts" (OuterVolumeSpecName: "scripts") pod "efa87d2e-769b-4517-a700-73d24e233846" (UID: "efa87d2e-769b-4517-a700-73d24e233846"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:53.702635 master-0 kubenswrapper[13984]: I0312 12:47:53.691454 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efa87d2e-769b-4517-a700-73d24e233846-kube-api-access-xnqqt" (OuterVolumeSpecName: "kube-api-access-xnqqt") pod "efa87d2e-769b-4517-a700-73d24e233846" (UID: "efa87d2e-769b-4517-a700-73d24e233846"). InnerVolumeSpecName "kube-api-access-xnqqt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:53.769813 master-0 kubenswrapper[13984]: I0312 12:47:53.769694 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-config-data" (OuterVolumeSpecName: "config-data") pod "efa87d2e-769b-4517-a700-73d24e233846" (UID: "efa87d2e-769b-4517-a700-73d24e233846"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:53.789110 master-0 kubenswrapper[13984]: I0312 12:47:53.789055 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:53.789110 master-0 kubenswrapper[13984]: I0312 12:47:53.789105 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/efa87d2e-769b-4517-a700-73d24e233846-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:53.789793 master-0 kubenswrapper[13984]: I0312 12:47:53.789121 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:53.789793 master-0 kubenswrapper[13984]: I0312 12:47:53.789137 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnqqt\" (UniqueName: \"kubernetes.io/projected/efa87d2e-769b-4517-a700-73d24e233846-kube-api-access-xnqqt\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:53.857566 master-0 kubenswrapper[13984]: I0312 12:47:53.857016 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "efa87d2e-769b-4517-a700-73d24e233846" (UID: "efa87d2e-769b-4517-a700-73d24e233846"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:53.859158 master-0 kubenswrapper[13984]: I0312 12:47:53.858536 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efa87d2e-769b-4517-a700-73d24e233846" (UID: "efa87d2e-769b-4517-a700-73d24e233846"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:53.893599 master-0 kubenswrapper[13984]: I0312 12:47:53.891936 13984 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:53.893599 master-0 kubenswrapper[13984]: I0312 12:47:53.891984 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:53.893830 master-0 kubenswrapper[13984]: I0312 12:47:53.893788 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "efa87d2e-769b-4517-a700-73d24e233846" (UID: "efa87d2e-769b-4517-a700-73d24e233846"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:53.976961 master-0 kubenswrapper[13984]: I0312 12:47:53.976919 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:53.997465 master-0 kubenswrapper[13984]: I0312 12:47:53.997395 13984 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/efa87d2e-769b-4517-a700-73d24e233846-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.027236 master-0 kubenswrapper[13984]: I0312 12:47:54.025906 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="920ddbc6-460a-455d-b6eb-8a9393dc169e" path="/var/lib/kubelet/pods/920ddbc6-460a-455d-b6eb-8a9393dc169e/volumes" Mar 12 12:47:54.040521 master-0 kubenswrapper[13984]: I0312 12:47:54.039577 13984 generic.go:334] "Generic (PLEG): container finished" podID="4b525941-42e8-455b-b117-fae2d47b7977" containerID="23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532" exitCode=0 Mar 12 12:47:54.040521 master-0 kubenswrapper[13984]: I0312 12:47:54.039644 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"4b525941-42e8-455b-b117-fae2d47b7977","Type":"ContainerDied","Data":"23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532"} Mar 12 12:47:54.040521 master-0 kubenswrapper[13984]: I0312 12:47:54.039670 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"4b525941-42e8-455b-b117-fae2d47b7977","Type":"ContainerDied","Data":"ebab59b5ea049c72253b7bbe7432951627489d78e0bcb1162b8c4ac28cd3f2e4"} Mar 12 12:47:54.040521 master-0 kubenswrapper[13984]: I0312 12:47:54.039690 13984 scope.go:117] "RemoveContainer" containerID="23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532" Mar 12 12:47:54.040521 master-0 kubenswrapper[13984]: I0312 12:47:54.039776 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.078164 master-0 kubenswrapper[13984]: I0312 12:47:54.078090 13984 generic.go:334] "Generic (PLEG): container finished" podID="55ffff98-83d0-4d6c-829f-a35636db60fb" containerID="fef25b4a94188b40d32e36dfc92c958533c27227a420b4c7d79624434b529ef9" exitCode=0 Mar 12 12:47:54.078370 master-0 kubenswrapper[13984]: I0312 12:47:54.078250 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"55ffff98-83d0-4d6c-829f-a35636db60fb","Type":"ContainerDied","Data":"fef25b4a94188b40d32e36dfc92c958533c27227a420b4c7d79624434b529ef9"} Mar 12 12:47:54.088996 master-0 kubenswrapper[13984]: I0312 12:47:54.088443 13984 generic.go:334] "Generic (PLEG): container finished" podID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerID="9664021e473026c9b519516a59fb0bbcbc39bddc0af1dd2934f2954f6177adf2" exitCode=0 Mar 12 12:47:54.088996 master-0 kubenswrapper[13984]: I0312 12:47:54.088568 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" event={"ID":"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545","Type":"ContainerDied","Data":"9664021e473026c9b519516a59fb0bbcbc39bddc0af1dd2934f2954f6177adf2"} Mar 12 12:47:54.099821 master-0 kubenswrapper[13984]: I0312 12:47:54.099361 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-combined-ca-bundle\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.099821 master-0 kubenswrapper[13984]: I0312 12:47:54.099466 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kk7d\" (UniqueName: \"kubernetes.io/projected/4b525941-42e8-455b-b117-fae2d47b7977-kube-api-access-8kk7d\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: 
\"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.099821 master-0 kubenswrapper[13984]: I0312 12:47:54.099566 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-httpd-run\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.099821 master-0 kubenswrapper[13984]: I0312 12:47:54.099735 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-scripts\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.117272 master-0 kubenswrapper[13984]: I0312 12:47:54.110194 13984 scope.go:117] "RemoveContainer" containerID="5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e" Mar 12 12:47:54.119180 master-0 kubenswrapper[13984]: I0312 12:47:54.118427 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:54.125334 master-0 kubenswrapper[13984]: I0312 12:47:54.125171 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-scripts" (OuterVolumeSpecName: "scripts") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:54.133581 master-0 kubenswrapper[13984]: I0312 12:47:54.127882 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-internal-tls-certs\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.133581 master-0 kubenswrapper[13984]: I0312 12:47:54.127975 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-config-data\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.136463 master-0 kubenswrapper[13984]: I0312 12:47:54.134036 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.136463 master-0 kubenswrapper[13984]: I0312 12:47:54.134092 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-logs\") pod \"4b525941-42e8-455b-b117-fae2d47b7977\" (UID: \"4b525941-42e8-455b-b117-fae2d47b7977\") " Mar 12 12:47:54.136463 master-0 kubenswrapper[13984]: I0312 12:47:54.134532 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-logs" (OuterVolumeSpecName: "logs") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:54.136463 master-0 kubenswrapper[13984]: I0312 12:47:54.135332 13984 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.136463 master-0 kubenswrapper[13984]: I0312 12:47:54.135355 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.136463 master-0 kubenswrapper[13984]: I0312 12:47:54.135368 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b525941-42e8-455b-b117-fae2d47b7977-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.151571 master-0 kubenswrapper[13984]: I0312 12:47:54.141257 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b525941-42e8-455b-b117-fae2d47b7977-kube-api-access-8kk7d" (OuterVolumeSpecName: "kube-api-access-8kk7d") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "kube-api-access-8kk7d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:54.158336 master-0 kubenswrapper[13984]: I0312 12:47:54.154954 13984 generic.go:334] "Generic (PLEG): container finished" podID="efa87d2e-769b-4517-a700-73d24e233846" containerID="4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae" exitCode=0 Mar 12 12:47:54.158336 master-0 kubenswrapper[13984]: I0312 12:47:54.156421 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5cbb6f6f66-dz95j" Mar 12 12:47:54.158336 master-0 kubenswrapper[13984]: I0312 12:47:54.157122 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb6f6f66-dz95j" event={"ID":"efa87d2e-769b-4517-a700-73d24e233846","Type":"ContainerDied","Data":"4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae"} Mar 12 12:47:54.158336 master-0 kubenswrapper[13984]: I0312 12:47:54.157156 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5cbb6f6f66-dz95j" event={"ID":"efa87d2e-769b-4517-a700-73d24e233846","Type":"ContainerDied","Data":"47ce580d81fbd81c5739b622e0cc01c252f290ff92c766dd388f83489dd36010"} Mar 12 12:47:54.218268 master-0 kubenswrapper[13984]: I0312 12:47:54.218217 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:54.230535 master-0 kubenswrapper[13984]: I0312 12:47:54.230470 13984 scope.go:117] "RemoveContainer" containerID="23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532" Mar 12 12:47:54.231630 master-0 kubenswrapper[13984]: E0312 12:47:54.231553 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532\": container with ID starting with 23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532 not found: ID does not exist" containerID="23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532" Mar 12 12:47:54.231715 master-0 kubenswrapper[13984]: I0312 12:47:54.231651 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532"} err="failed to get container status \"23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532\": rpc error: code = NotFound desc = could not find container \"23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532\": container with ID starting with 23355f8bf1f162492923545fd0963a45f2b0f491f81e1413a3f46d8ac9ebd532 not found: ID does not exist" Mar 12 12:47:54.231715 master-0 kubenswrapper[13984]: I0312 12:47:54.231684 13984 scope.go:117] "RemoveContainer" containerID="5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e" Mar 12 12:47:54.234134 master-0 kubenswrapper[13984]: I0312 12:47:54.234074 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5cbb6f6f66-dz95j"] Mar 12 12:47:54.237602 master-0 kubenswrapper[13984]: E0312 12:47:54.236938 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e\": container with ID 
starting with 5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e not found: ID does not exist" containerID="5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e" Mar 12 12:47:54.237602 master-0 kubenswrapper[13984]: I0312 12:47:54.236984 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e"} err="failed to get container status \"5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e\": rpc error: code = NotFound desc = could not find container \"5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e\": container with ID starting with 5fd3a37208fdb6babe6410af7c50f5d99bb54ed64dc33306740ef11f14e6e77e not found: ID does not exist" Mar 12 12:47:54.237602 master-0 kubenswrapper[13984]: I0312 12:47:54.237013 13984 scope.go:117] "RemoveContainer" containerID="4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae" Mar 12 12:47:54.238109 master-0 kubenswrapper[13984]: I0312 12:47:54.238083 13984 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.239229 master-0 kubenswrapper[13984]: I0312 12:47:54.239197 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kk7d\" (UniqueName: \"kubernetes.io/projected/4b525941-42e8-455b-b117-fae2d47b7977-kube-api-access-8kk7d\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.254779 master-0 kubenswrapper[13984]: I0312 12:47:54.254690 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5cbb6f6f66-dz95j"] Mar 12 12:47:54.268999 master-0 kubenswrapper[13984]: I0312 12:47:54.268939 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-config-data" 
(OuterVolumeSpecName: "config-data") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:54.277644 master-0 kubenswrapper[13984]: I0312 12:47:54.276717 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:54.297958 master-0 kubenswrapper[13984]: I0312 12:47:54.297911 13984 scope.go:117] "RemoveContainer" containerID="20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f" Mar 12 12:47:54.336631 master-0 kubenswrapper[13984]: I0312 12:47:54.336559 13984 scope.go:117] "RemoveContainer" containerID="4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae" Mar 12 12:47:54.341202 master-0 kubenswrapper[13984]: E0312 12:47:54.341152 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae\": container with ID starting with 4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae not found: ID does not exist" containerID="4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae" Mar 12 12:47:54.341333 master-0 kubenswrapper[13984]: I0312 12:47:54.341201 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae"} err="failed to get container status \"4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae\": rpc error: code = NotFound desc = could not find container 
\"4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae\": container with ID starting with 4fb07447c20dd7ee5611587fa743314d8e728c6f1fe8231d4f576bcf83bb02ae not found: ID does not exist" Mar 12 12:47:54.341333 master-0 kubenswrapper[13984]: I0312 12:47:54.341226 13984 scope.go:117] "RemoveContainer" containerID="20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f" Mar 12 12:47:54.345545 master-0 kubenswrapper[13984]: I0312 12:47:54.341951 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.345545 master-0 kubenswrapper[13984]: E0312 12:47:54.342332 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f\": container with ID starting with 20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f not found: ID does not exist" containerID="20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f" Mar 12 12:47:54.345545 master-0 kubenswrapper[13984]: I0312 12:47:54.342352 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f"} err="failed to get container status \"20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f\": rpc error: code = NotFound desc = could not find container \"20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f\": container with ID starting with 20e6c1dedcabe51bf1e4491111312941c6bc97e9fd4de27729cf45f6a248c21f not found: ID does not exist" Mar 12 12:47:54.345545 master-0 kubenswrapper[13984]: I0312 12:47:54.342419 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/4b525941-42e8-455b-b117-fae2d47b7977-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.457119 master-0 kubenswrapper[13984]: I0312 12:47:54.457064 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616" (OuterVolumeSpecName: "glance") pod "4b525941-42e8-455b-b117-fae2d47b7977" (UID: "4b525941-42e8-455b-b117-fae2d47b7977"). InnerVolumeSpecName "pvc-49a24b6d-1684-4129-8811-b908648797de". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 12 12:47:54.476875 master-0 kubenswrapper[13984]: I0312 12:47:54.475112 13984 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") on node \"master-0\" " Mar 12 12:47:54.507631 master-0 kubenswrapper[13984]: I0312 12:47:54.507468 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3eb07d51-f37c-4dcd-9550-f6a0d58c8383\" (UniqueName: \"kubernetes.io/csi/topolvm.io^704a2c64-c198-43c9-b817-528252118d2e\") pod \"glance-f98a5-default-external-api-0\" (UID: \"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910\") " pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:54.507844 master-0 kubenswrapper[13984]: I0312 12:47:54.507776 13984 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 12 12:47:54.508264 master-0 kubenswrapper[13984]: I0312 12:47:54.507991 13984 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-49a24b6d-1684-4129-8811-b908648797de" (UniqueName: "kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616") on node "master-0" Mar 12 12:47:54.586549 master-0 kubenswrapper[13984]: I0312 12:47:54.586367 13984 reconciler_common.go:293] "Volume detached for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:54.605123 master-0 kubenswrapper[13984]: I0312 12:47:54.605073 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:47:54.734563 master-0 kubenswrapper[13984]: I0312 12:47:54.733729 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:47:54.811174 master-0 kubenswrapper[13984]: I0312 12:47:54.811099 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:47:54.852695 master-0 kubenswrapper[13984]: I0312 12:47:54.852169 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:47:54.852886 master-0 kubenswrapper[13984]: E0312 12:47:54.852787 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-log" Mar 12 12:47:54.852886 master-0 kubenswrapper[13984]: I0312 12:47:54.852805 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-log" Mar 12 12:47:54.852886 master-0 kubenswrapper[13984]: E0312 12:47:54.852824 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-httpd" Mar 12 
12:47:54.852886 master-0 kubenswrapper[13984]: I0312 12:47:54.852831 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-httpd" Mar 12 12:47:54.852886 master-0 kubenswrapper[13984]: E0312 12:47:54.852861 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-api" Mar 12 12:47:54.852886 master-0 kubenswrapper[13984]: I0312 12:47:54.852869 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-api" Mar 12 12:47:54.853085 master-0 kubenswrapper[13984]: E0312 12:47:54.852913 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-log" Mar 12 12:47:54.853085 master-0 kubenswrapper[13984]: I0312 12:47:54.852923 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-log" Mar 12 12:47:54.853367 master-0 kubenswrapper[13984]: I0312 12:47:54.853191 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-httpd" Mar 12 12:47:54.853367 master-0 kubenswrapper[13984]: I0312 12:47:54.853224 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-api" Mar 12 12:47:54.853367 master-0 kubenswrapper[13984]: I0312 12:47:54.853251 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="efa87d2e-769b-4517-a700-73d24e233846" containerName="placement-log" Mar 12 12:47:54.853367 master-0 kubenswrapper[13984]: I0312 12:47:54.853281 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b525941-42e8-455b-b117-fae2d47b7977" containerName="glance-log" Mar 12 12:47:54.854776 master-0 kubenswrapper[13984]: I0312 12:47:54.854742 13984 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.857245 master-0 kubenswrapper[13984]: I0312 12:47:54.857202 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 12 12:47:54.866918 master-0 kubenswrapper[13984]: I0312 12:47:54.866867 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-f98a5-default-internal-config-data" Mar 12 12:47:54.882510 master-0 kubenswrapper[13984]: I0312 12:47:54.869696 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 12 12:47:54.882510 master-0 kubenswrapper[13984]: I0312 12:47:54.870138 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:47:54.906416 master-0 kubenswrapper[13984]: I0312 12:47:54.906363 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic\") pod \"55ffff98-83d0-4d6c-829f-a35636db60fb\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " Mar 12 12:47:54.906742 master-0 kubenswrapper[13984]: I0312 12:47:54.906630 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/55ffff98-83d0-4d6c-829f-a35636db60fb-etc-podinfo\") pod \"55ffff98-83d0-4d6c-829f-a35636db60fb\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " Mar 12 12:47:54.906805 master-0 kubenswrapper[13984]: I0312 12:47:54.906718 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-combined-ca-bundle\") pod \"55ffff98-83d0-4d6c-829f-a35636db60fb\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " Mar 12 12:47:54.906805 
master-0 kubenswrapper[13984]: I0312 12:47:54.906789 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"55ffff98-83d0-4d6c-829f-a35636db60fb\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " Mar 12 12:47:54.906898 master-0 kubenswrapper[13984]: I0312 12:47:54.906860 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfplw\" (UniqueName: \"kubernetes.io/projected/55ffff98-83d0-4d6c-829f-a35636db60fb-kube-api-access-mfplw\") pod \"55ffff98-83d0-4d6c-829f-a35636db60fb\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " Mar 12 12:47:54.906980 master-0 kubenswrapper[13984]: I0312 12:47:54.906951 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-config\") pod \"55ffff98-83d0-4d6c-829f-a35636db60fb\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " Mar 12 12:47:54.907061 master-0 kubenswrapper[13984]: I0312 12:47:54.907037 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-scripts\") pod \"55ffff98-83d0-4d6c-829f-a35636db60fb\" (UID: \"55ffff98-83d0-4d6c-829f-a35636db60fb\") " Mar 12 12:47:54.907449 master-0 kubenswrapper[13984]: I0312 12:47:54.907414 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.907554 master-0 kubenswrapper[13984]: I0312 12:47:54.907501 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626c2e41-499c-4882-88e8-7d64d647be9e-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.907606 master-0 kubenswrapper[13984]: I0312 12:47:54.907585 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.907666 master-0 kubenswrapper[13984]: I0312 12:47:54.907643 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.907722 master-0 kubenswrapper[13984]: I0312 12:47:54.907696 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626c2e41-499c-4882-88e8-7d64d647be9e-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.907774 master-0 kubenswrapper[13984]: I0312 12:47:54.907728 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: 
\"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.907866 master-0 kubenswrapper[13984]: I0312 12:47:54.907831 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mkml\" (UniqueName: \"kubernetes.io/projected/626c2e41-499c-4882-88e8-7d64d647be9e-kube-api-access-8mkml\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.907930 master-0 kubenswrapper[13984]: I0312 12:47:54.907900 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-internal-tls-certs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:54.908913 master-0 kubenswrapper[13984]: I0312 12:47:54.908889 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "55ffff98-83d0-4d6c-829f-a35636db60fb" (UID: "55ffff98-83d0-4d6c-829f-a35636db60fb"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:54.910096 master-0 kubenswrapper[13984]: I0312 12:47:54.909280 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "55ffff98-83d0-4d6c-829f-a35636db60fb" (UID: "55ffff98-83d0-4d6c-829f-a35636db60fb"). InnerVolumeSpecName "var-lib-ironic". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:47:54.916171 master-0 kubenswrapper[13984]: I0312 12:47:54.915027 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-config" (OuterVolumeSpecName: "config") pod "55ffff98-83d0-4d6c-829f-a35636db60fb" (UID: "55ffff98-83d0-4d6c-829f-a35636db60fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:54.916550 master-0 kubenswrapper[13984]: I0312 12:47:54.916508 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/55ffff98-83d0-4d6c-829f-a35636db60fb-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "55ffff98-83d0-4d6c-829f-a35636db60fb" (UID: "55ffff98-83d0-4d6c-829f-a35636db60fb"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 12 12:47:54.922727 master-0 kubenswrapper[13984]: I0312 12:47:54.921618 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-scripts" (OuterVolumeSpecName: "scripts") pod "55ffff98-83d0-4d6c-829f-a35636db60fb" (UID: "55ffff98-83d0-4d6c-829f-a35636db60fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:54.931571 master-0 kubenswrapper[13984]: I0312 12:47:54.930038 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ffff98-83d0-4d6c-829f-a35636db60fb-kube-api-access-mfplw" (OuterVolumeSpecName: "kube-api-access-mfplw") pod "55ffff98-83d0-4d6c-829f-a35636db60fb" (UID: "55ffff98-83d0-4d6c-829f-a35636db60fb"). InnerVolumeSpecName "kube-api-access-mfplw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:47:54.981973 master-0 kubenswrapper[13984]: I0312 12:47:54.981877 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55ffff98-83d0-4d6c-829f-a35636db60fb" (UID: "55ffff98-83d0-4d6c-829f-a35636db60fb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:47:55.010046 master-0 kubenswrapper[13984]: I0312 12:47:55.009977 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010046 master-0 kubenswrapper[13984]: I0312 12:47:55.010049 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626c2e41-499c-4882-88e8-7d64d647be9e-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010327 master-0 kubenswrapper[13984]: I0312 12:47:55.010074 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010327 master-0 kubenswrapper[13984]: I0312 12:47:55.010148 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mkml\" (UniqueName: 
\"kubernetes.io/projected/626c2e41-499c-4882-88e8-7d64d647be9e-kube-api-access-8mkml\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010327 master-0 kubenswrapper[13984]: I0312 12:47:55.010188 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-internal-tls-certs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010327 master-0 kubenswrapper[13984]: I0312 12:47:55.010252 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010327 master-0 kubenswrapper[13984]: I0312 12:47:55.010273 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626c2e41-499c-4882-88e8-7d64d647be9e-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010327 master-0 kubenswrapper[13984]: I0312 12:47:55.010317 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.010658 master-0 kubenswrapper[13984]: I0312 12:47:55.010370 13984 
reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:55.010658 master-0 kubenswrapper[13984]: I0312 12:47:55.010383 13984 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/55ffff98-83d0-4d6c-829f-a35636db60fb-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:55.010658 master-0 kubenswrapper[13984]: I0312 12:47:55.010392 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:55.010658 master-0 kubenswrapper[13984]: I0312 12:47:55.010404 13984 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/55ffff98-83d0-4d6c-829f-a35636db60fb-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:55.010658 master-0 kubenswrapper[13984]: I0312 12:47:55.010414 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfplw\" (UniqueName: \"kubernetes.io/projected/55ffff98-83d0-4d6c-829f-a35636db60fb-kube-api-access-mfplw\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:55.010658 master-0 kubenswrapper[13984]: I0312 12:47:55.010426 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:55.010658 master-0 kubenswrapper[13984]: I0312 12:47:55.010434 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/55ffff98-83d0-4d6c-829f-a35636db60fb-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:47:55.013288 master-0 
kubenswrapper[13984]: I0312 12:47:55.013256 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-scripts\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.015500 master-0 kubenswrapper[13984]: I0312 12:47:55.015068 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-internal-tls-certs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.015500 master-0 kubenswrapper[13984]: I0312 12:47:55.015169 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/626c2e41-499c-4882-88e8-7d64d647be9e-logs\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.016905 master-0 kubenswrapper[13984]: I0312 12:47:55.016012 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/626c2e41-499c-4882-88e8-7d64d647be9e-httpd-run\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.016905 master-0 kubenswrapper[13984]: I0312 12:47:55.016299 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-config-data\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.018085 master-0 
kubenswrapper[13984]: I0312 12:47:55.017966 13984 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 12 12:47:55.018085 master-0 kubenswrapper[13984]: I0312 12:47:55.018003 13984 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/95c4692dce0b9808ca42bee34ff116315d549daf622dbdf12b07464f0e4f3ac4/globalmount\"" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.020607 master-0 kubenswrapper[13984]: I0312 12:47:55.020553 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/626c2e41-499c-4882-88e8-7d64d647be9e-combined-ca-bundle\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.032139 master-0 kubenswrapper[13984]: I0312 12:47:55.032103 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mkml\" (UniqueName: \"kubernetes.io/projected/626c2e41-499c-4882-88e8-7d64d647be9e-kube-api-access-8mkml\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:55.157172 master-0 kubenswrapper[13984]: I0312 12:47:55.156764 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-776949857-qhnzl" Mar 12 12:47:55.244268 master-0 kubenswrapper[13984]: I0312 12:47:55.235047 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" 
event={"ID":"55ffff98-83d0-4d6c-829f-a35636db60fb","Type":"ContainerDied","Data":"c88abf30d2082a13f384d874bec58d8bfa9aac20a4f6732122f6a07f4ee4fbc9"} Mar 12 12:47:55.244268 master-0 kubenswrapper[13984]: I0312 12:47:55.235105 13984 scope.go:117] "RemoveContainer" containerID="fef25b4a94188b40d32e36dfc92c958533c27227a420b4c7d79624434b529ef9" Mar 12 12:47:55.244268 master-0 kubenswrapper[13984]: I0312 12:47:55.235241 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 12 12:47:55.244268 master-0 kubenswrapper[13984]: I0312 12:47:55.240978 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" event={"ID":"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545","Type":"ContainerStarted","Data":"e8013f87d8357a28b6f88b55d2368b7b95eee75281404addd1a8208622a8c01e"} Mar 12 12:47:55.244268 master-0 kubenswrapper[13984]: I0312 12:47:55.241324 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" Mar 12 12:47:55.289733 master-0 kubenswrapper[13984]: I0312 12:47:55.288929 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-external-api-0"] Mar 12 12:47:55.304982 master-0 kubenswrapper[13984]: I0312 12:47:55.298754 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" podStartSLOduration=8.298730885 podStartE2EDuration="8.298730885s" podCreationTimestamp="2026-03-12 12:47:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:55.277121861 +0000 UTC m=+1407.475137373" watchObservedRunningTime="2026-03-12 12:47:55.298730885 +0000 UTC m=+1407.496746377" Mar 12 12:47:55.439644 master-0 kubenswrapper[13984]: I0312 12:47:55.439536 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 
12:47:55.498595 master-0 kubenswrapper[13984]: I0312 12:47:55.493139 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 12:47:55.519496 master-0 kubenswrapper[13984]: I0312 12:47:55.517049 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 12:47:55.519496 master-0 kubenswrapper[13984]: E0312 12:47:55.517688 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ffff98-83d0-4d6c-829f-a35636db60fb" containerName="ironic-python-agent-init" Mar 12 12:47:55.519496 master-0 kubenswrapper[13984]: I0312 12:47:55.517711 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ffff98-83d0-4d6c-829f-a35636db60fb" containerName="ironic-python-agent-init" Mar 12 12:47:55.519496 master-0 kubenswrapper[13984]: I0312 12:47:55.518028 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ffff98-83d0-4d6c-829f-a35636db60fb" containerName="ironic-python-agent-init" Mar 12 12:47:55.528511 master-0 kubenswrapper[13984]: I0312 12:47:55.528314 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 12 12:47:55.532514 master-0 kubenswrapper[13984]: I0312 12:47:55.531012 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Mar 12 12:47:55.532514 master-0 kubenswrapper[13984]: I0312 12:47:55.531720 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 12 12:47:55.532759 master-0 kubenswrapper[13984]: I0312 12:47:55.532662 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 12 12:47:55.536508 master-0 kubenswrapper[13984]: I0312 12:47:55.532861 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 12 12:47:55.536508 master-0 kubenswrapper[13984]: I0312 12:47:55.533053 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Mar 12 12:47:55.564497 master-0 kubenswrapper[13984]: I0312 12:47:55.562194 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625592 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/61638247-2447-4459-8be8-f91c2e718b46-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625707 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/61638247-2447-4459-8be8-f91c2e718b46-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: 
\"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625757 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-config\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625819 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625841 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-scripts\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625855 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625909 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: 
\"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.625974 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/61638247-2447-4459-8be8-f91c2e718b46-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.626079 master-0 kubenswrapper[13984]: I0312 12:47:55.626008 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspmj\" (UniqueName: \"kubernetes.io/projected/61638247-2447-4459-8be8-f91c2e718b46-kube-api-access-vspmj\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.727618 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.727725 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/61638247-2447-4459-8be8-f91c2e718b46-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.727766 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspmj\" (UniqueName: \"kubernetes.io/projected/61638247-2447-4459-8be8-f91c2e718b46-kube-api-access-vspmj\") pod \"ironic-inspector-0\" (UID: 
\"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.727830 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/61638247-2447-4459-8be8-f91c2e718b46-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.727920 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/61638247-2447-4459-8be8-f91c2e718b46-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.727966 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-config\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.728028 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.728053 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-scripts\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " 
pod="openstack/ironic-inspector-0" Mar 12 12:47:55.728702 master-0 kubenswrapper[13984]: I0312 12:47:55.728074 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.729694 master-0 kubenswrapper[13984]: I0312 12:47:55.729336 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/61638247-2447-4459-8be8-f91c2e718b46-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.738517 master-0 kubenswrapper[13984]: I0312 12:47:55.737877 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.738517 master-0 kubenswrapper[13984]: I0312 12:47:55.737987 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-scripts\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.738517 master-0 kubenswrapper[13984]: I0312 12:47:55.738217 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/61638247-2447-4459-8be8-f91c2e718b46-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.739766 master-0 
kubenswrapper[13984]: I0312 12:47:55.739692 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-config\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.742823 master-0 kubenswrapper[13984]: I0312 12:47:55.742788 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.743299 master-0 kubenswrapper[13984]: I0312 12:47:55.743272 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61638247-2447-4459-8be8-f91c2e718b46-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.748244 master-0 kubenswrapper[13984]: I0312 12:47:55.748196 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/61638247-2447-4459-8be8-f91c2e718b46-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.754152 master-0 kubenswrapper[13984]: I0312 12:47:55.754117 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspmj\" (UniqueName: \"kubernetes.io/projected/61638247-2447-4459-8be8-f91c2e718b46-kube-api-access-vspmj\") pod \"ironic-inspector-0\" (UID: \"61638247-2447-4459-8be8-f91c2e718b46\") " pod="openstack/ironic-inspector-0" Mar 12 12:47:55.851454 master-0 kubenswrapper[13984]: I0312 12:47:55.851353 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 12 12:47:55.918421 master-0 kubenswrapper[13984]: I0312 12:47:55.916036 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-49a24b6d-1684-4129-8811-b908648797de\" (UniqueName: \"kubernetes.io/csi/topolvm.io^933ed12f-3164-4415-bd61-6144d9ebb616\") pod \"glance-f98a5-default-internal-api-0\" (UID: \"626c2e41-499c-4882-88e8-7d64d647be9e\") " pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:56.025092 master-0 kubenswrapper[13984]: I0312 12:47:56.025037 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b525941-42e8-455b-b117-fae2d47b7977" path="/var/lib/kubelet/pods/4b525941-42e8-455b-b117-fae2d47b7977/volumes" Mar 12 12:47:56.027896 master-0 kubenswrapper[13984]: I0312 12:47:56.025928 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ffff98-83d0-4d6c-829f-a35636db60fb" path="/var/lib/kubelet/pods/55ffff98-83d0-4d6c-829f-a35636db60fb/volumes" Mar 12 12:47:56.027896 master-0 kubenswrapper[13984]: I0312 12:47:56.026749 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efa87d2e-769b-4517-a700-73d24e233846" path="/var/lib/kubelet/pods/efa87d2e-769b-4517-a700-73d24e233846/volumes" Mar 12 12:47:56.118512 master-0 kubenswrapper[13984]: I0312 12:47:56.118448 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-f98a5-default-internal-api-0" Mar 12 12:47:56.318254 master-0 kubenswrapper[13984]: I0312 12:47:56.318188 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910","Type":"ContainerStarted","Data":"7d53df34205e10fa4289b104c333081676e60dc11c6555aada47112bdcb8a270"} Mar 12 12:47:56.567596 master-0 kubenswrapper[13984]: W0312 12:47:56.567404 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61638247_2447_4459_8be8_f91c2e718b46.slice/crio-1512ef56f05d8b7b9209942793c892804a30105a8e164a71a4a454a4c3ae564f WatchSource:0}: Error finding container 1512ef56f05d8b7b9209942793c892804a30105a8e164a71a4a454a4c3ae564f: Status 404 returned error can't find the container with id 1512ef56f05d8b7b9209942793c892804a30105a8e164a71a4a454a4c3ae564f Mar 12 12:47:56.585524 master-0 kubenswrapper[13984]: I0312 12:47:56.581662 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 12 12:47:56.847467 master-0 kubenswrapper[13984]: W0312 12:47:56.846414 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod626c2e41_499c_4882_88e8_7d64d647be9e.slice/crio-cea620b187bb0e97e40a915144d617db425d910d8d957cf2571fd2c4e59b2c6a WatchSource:0}: Error finding container cea620b187bb0e97e40a915144d617db425d910d8d957cf2571fd2c4e59b2c6a: Status 404 returned error can't find the container with id cea620b187bb0e97e40a915144d617db425d910d8d957cf2571fd2c4e59b2c6a Mar 12 12:47:56.848557 master-0 kubenswrapper[13984]: I0312 12:47:56.847796 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-f98a5-default-internal-api-0"] Mar 12 12:47:57.424509 master-0 kubenswrapper[13984]: I0312 12:47:57.419389 13984 generic.go:334] "Generic (PLEG): 
container finished" podID="61638247-2447-4459-8be8-f91c2e718b46" containerID="51e0cae76b969b3d35487bc10ec695fff1ef6c8fc038f67afd538f0b80e74513" exitCode=0 Mar 12 12:47:57.424509 master-0 kubenswrapper[13984]: I0312 12:47:57.419525 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerDied","Data":"51e0cae76b969b3d35487bc10ec695fff1ef6c8fc038f67afd538f0b80e74513"} Mar 12 12:47:57.424509 master-0 kubenswrapper[13984]: I0312 12:47:57.419561 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerStarted","Data":"1512ef56f05d8b7b9209942793c892804a30105a8e164a71a4a454a4c3ae564f"} Mar 12 12:47:57.461288 master-0 kubenswrapper[13984]: I0312 12:47:57.459803 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910","Type":"ContainerStarted","Data":"b99cdb88984392f46398723b6b1cc3ac77b729e223f86a518edcbf8a2973b4b4"} Mar 12 12:47:57.467502 master-0 kubenswrapper[13984]: I0312 12:47:57.466273 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"626c2e41-499c-4882-88e8-7d64d647be9e","Type":"ContainerStarted","Data":"cea620b187bb0e97e40a915144d617db425d910d8d957cf2571fd2c4e59b2c6a"} Mar 12 12:47:58.480305 master-0 kubenswrapper[13984]: I0312 12:47:58.479976 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-external-api-0" event={"ID":"dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910","Type":"ContainerStarted","Data":"9ba276538bc14863aeaf050f5c3b5553f708f09878e0dc6f1aa7f839e722fdad"} Mar 12 12:47:58.484436 master-0 kubenswrapper[13984]: I0312 12:47:58.484329 13984 generic.go:334] "Generic (PLEG): container finished" podID="43e6fb75-b813-4074-8029-d6817b1bb9e2" 
containerID="aadfeb7c3c272ecb3417d38a31a0ae0c25d1b20ce600b2ad9ef70bf1e3df0fac" exitCode=0 Mar 12 12:47:58.484436 master-0 kubenswrapper[13984]: I0312 12:47:58.484419 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerDied","Data":"aadfeb7c3c272ecb3417d38a31a0ae0c25d1b20ce600b2ad9ef70bf1e3df0fac"} Mar 12 12:47:58.488678 master-0 kubenswrapper[13984]: I0312 12:47:58.488630 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"626c2e41-499c-4882-88e8-7d64d647be9e","Type":"ContainerStarted","Data":"308b996892d1be89195c41a023cb0b6c4979f31873207fa3ffad01a13fe55248"} Mar 12 12:47:58.488844 master-0 kubenswrapper[13984]: I0312 12:47:58.488684 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-f98a5-default-internal-api-0" event={"ID":"626c2e41-499c-4882-88e8-7d64d647be9e","Type":"ContainerStarted","Data":"4b1d7b75c8b529578c1282d08df6c938075cbe95ccea65ef1ed278bf3ecd4fe6"} Mar 12 12:47:58.797237 master-0 kubenswrapper[13984]: I0312 12:47:58.796429 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f98a5-default-external-api-0" podStartSLOduration=5.796407419 podStartE2EDuration="5.796407419s" podCreationTimestamp="2026-03-12 12:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:58.781784699 +0000 UTC m=+1410.979800221" watchObservedRunningTime="2026-03-12 12:47:58.796407419 +0000 UTC m=+1410.994422911" Mar 12 12:47:59.550243 master-0 kubenswrapper[13984]: I0312 12:47:59.549904 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-f98a5-default-internal-api-0" podStartSLOduration=5.549880353 podStartE2EDuration="5.549880353s" podCreationTimestamp="2026-03-12 12:47:54 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:47:59.534904427 +0000 UTC m=+1411.732919919" watchObservedRunningTime="2026-03-12 12:47:59.549880353 +0000 UTC m=+1411.747895835" Mar 12 12:48:03.188713 master-0 kubenswrapper[13984]: I0312 12:48:03.188664 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" Mar 12 12:48:03.278469 master-0 kubenswrapper[13984]: I0312 12:48:03.278397 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666f5ccff9-kdhq6"] Mar 12 12:48:03.278722 master-0 kubenswrapper[13984]: I0312 12:48:03.278685 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" podUID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerName="dnsmasq-dns" containerID="cri-o://19e0c2ceb1955d1c076fe745679ae61a2f682c3d2b1f903741f2b70bad93b5dc" gracePeriod=10 Mar 12 12:48:03.568516 master-0 kubenswrapper[13984]: I0312 12:48:03.567708 13984 generic.go:334] "Generic (PLEG): container finished" podID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerID="19e0c2ceb1955d1c076fe745679ae61a2f682c3d2b1f903741f2b70bad93b5dc" exitCode=0 Mar 12 12:48:03.568516 master-0 kubenswrapper[13984]: I0312 12:48:03.567764 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" event={"ID":"a1893ef7-895c-49c1-bfcc-468af72a46a6","Type":"ContainerDied","Data":"19e0c2ceb1955d1c076fe745679ae61a2f682c3d2b1f903741f2b70bad93b5dc"} Mar 12 12:48:03.865178 master-0 kubenswrapper[13984]: I0312 12:48:03.865124 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:48:03.975201 master-0 kubenswrapper[13984]: I0312 12:48:03.975138 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-nb\") pod \"a1893ef7-895c-49c1-bfcc-468af72a46a6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " Mar 12 12:48:03.975393 master-0 kubenswrapper[13984]: I0312 12:48:03.975355 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-svc\") pod \"a1893ef7-895c-49c1-bfcc-468af72a46a6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " Mar 12 12:48:03.975453 master-0 kubenswrapper[13984]: I0312 12:48:03.975427 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-sb\") pod \"a1893ef7-895c-49c1-bfcc-468af72a46a6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " Mar 12 12:48:03.975531 master-0 kubenswrapper[13984]: I0312 12:48:03.975459 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xpc2m\" (UniqueName: \"kubernetes.io/projected/a1893ef7-895c-49c1-bfcc-468af72a46a6-kube-api-access-xpc2m\") pod \"a1893ef7-895c-49c1-bfcc-468af72a46a6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " Mar 12 12:48:03.975609 master-0 kubenswrapper[13984]: I0312 12:48:03.975579 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-swift-storage-0\") pod \"a1893ef7-895c-49c1-bfcc-468af72a46a6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " Mar 12 12:48:03.975659 master-0 kubenswrapper[13984]: I0312 12:48:03.975622 
13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-config\") pod \"a1893ef7-895c-49c1-bfcc-468af72a46a6\" (UID: \"a1893ef7-895c-49c1-bfcc-468af72a46a6\") " Mar 12 12:48:03.985057 master-0 kubenswrapper[13984]: I0312 12:48:03.984308 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1893ef7-895c-49c1-bfcc-468af72a46a6-kube-api-access-xpc2m" (OuterVolumeSpecName: "kube-api-access-xpc2m") pod "a1893ef7-895c-49c1-bfcc-468af72a46a6" (UID: "a1893ef7-895c-49c1-bfcc-468af72a46a6"). InnerVolumeSpecName "kube-api-access-xpc2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:48:04.051623 master-0 kubenswrapper[13984]: I0312 12:48:04.050287 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a1893ef7-895c-49c1-bfcc-468af72a46a6" (UID: "a1893ef7-895c-49c1-bfcc-468af72a46a6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:04.053970 master-0 kubenswrapper[13984]: I0312 12:48:04.053902 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a1893ef7-895c-49c1-bfcc-468af72a46a6" (UID: "a1893ef7-895c-49c1-bfcc-468af72a46a6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:04.057962 master-0 kubenswrapper[13984]: I0312 12:48:04.057656 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-config" (OuterVolumeSpecName: "config") pod "a1893ef7-895c-49c1-bfcc-468af72a46a6" (UID: "a1893ef7-895c-49c1-bfcc-468af72a46a6"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:04.067714 master-0 kubenswrapper[13984]: I0312 12:48:04.067653 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a1893ef7-895c-49c1-bfcc-468af72a46a6" (UID: "a1893ef7-895c-49c1-bfcc-468af72a46a6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:04.079211 master-0 kubenswrapper[13984]: I0312 12:48:04.079165 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:04.079211 master-0 kubenswrapper[13984]: I0312 12:48:04.079208 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:04.079211 master-0 kubenswrapper[13984]: I0312 12:48:04.079218 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:04.079466 master-0 kubenswrapper[13984]: I0312 12:48:04.079229 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xpc2m\" (UniqueName: \"kubernetes.io/projected/a1893ef7-895c-49c1-bfcc-468af72a46a6-kube-api-access-xpc2m\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:04.079466 master-0 kubenswrapper[13984]: I0312 12:48:04.079251 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:04.102520 master-0 
kubenswrapper[13984]: I0312 12:48:04.102444 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a1893ef7-895c-49c1-bfcc-468af72a46a6" (UID: "a1893ef7-895c-49c1-bfcc-468af72a46a6"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:04.191375 master-0 kubenswrapper[13984]: I0312 12:48:04.191319 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a1893ef7-895c-49c1-bfcc-468af72a46a6-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:04.583232 master-0 kubenswrapper[13984]: I0312 12:48:04.583172 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cd96k" event={"ID":"fc8b81ae-1976-4abe-8d82-4decd232dd98","Type":"ContainerStarted","Data":"b9e7771a91d8bfa9b9fd1e995379658c2be0a95aece8fbe61fd23bc4ca1c52bf"} Mar 12 12:48:04.590228 master-0 kubenswrapper[13984]: I0312 12:48:04.590176 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" event={"ID":"a1893ef7-895c-49c1-bfcc-468af72a46a6","Type":"ContainerDied","Data":"1b74648070c6aa6c77655f42dff3b8e27cec60e8cba1149da994ef06d24c3504"} Mar 12 12:48:04.590419 master-0 kubenswrapper[13984]: I0312 12:48:04.590249 13984 scope.go:117] "RemoveContainer" containerID="19e0c2ceb1955d1c076fe745679ae61a2f682c3d2b1f903741f2b70bad93b5dc" Mar 12 12:48:04.590419 master-0 kubenswrapper[13984]: I0312 12:48:04.590411 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666f5ccff9-kdhq6" Mar 12 12:48:04.606493 master-0 kubenswrapper[13984]: I0312 12:48:04.606422 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:48:04.606586 master-0 kubenswrapper[13984]: I0312 12:48:04.606511 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:48:04.609536 master-0 kubenswrapper[13984]: I0312 12:48:04.609451 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cd96k" podStartSLOduration=6.880490624 podStartE2EDuration="17.609434271s" podCreationTimestamp="2026-03-12 12:47:47 +0000 UTC" firstStartedPulling="2026-03-12 12:47:52.822122201 +0000 UTC m=+1405.020137683" lastFinishedPulling="2026-03-12 12:48:03.551065848 +0000 UTC m=+1415.749081330" observedRunningTime="2026-03-12 12:48:04.602619259 +0000 UTC m=+1416.800634751" watchObservedRunningTime="2026-03-12 12:48:04.609434271 +0000 UTC m=+1416.807449763" Mar 12 12:48:04.650313 master-0 kubenswrapper[13984]: I0312 12:48:04.650244 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666f5ccff9-kdhq6"] Mar 12 12:48:04.651065 master-0 kubenswrapper[13984]: I0312 12:48:04.651006 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:48:04.652979 master-0 kubenswrapper[13984]: I0312 12:48:04.652941 13984 scope.go:117] "RemoveContainer" containerID="7d9d5482a5114e010655542a41042f34fd17740c8c1ca188ba9ee4c8c2937639" Mar 12 12:48:04.658233 master-0 kubenswrapper[13984]: I0312 12:48:04.658177 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-f98a5-default-external-api-0" Mar 12 12:48:04.666663 master-0 kubenswrapper[13984]: I0312 12:48:04.666605 13984 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666f5ccff9-kdhq6"]
Mar 12 12:48:05.608025 master-0 kubenswrapper[13984]: I0312 12:48:05.607971 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-external-api-0"
Mar 12 12:48:05.608025 master-0 kubenswrapper[13984]: I0312 12:48:05.608031 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-external-api-0"
Mar 12 12:48:05.997162 master-0 kubenswrapper[13984]: I0312 12:48:05.997086 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1893ef7-895c-49c1-bfcc-468af72a46a6" path="/var/lib/kubelet/pods/a1893ef7-895c-49c1-bfcc-468af72a46a6/volumes"
Mar 12 12:48:06.119276 master-0 kubenswrapper[13984]: I0312 12:48:06.119200 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:06.119642 master-0 kubenswrapper[13984]: I0312 12:48:06.119468 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:06.163882 master-0 kubenswrapper[13984]: I0312 12:48:06.163810 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:06.179385 master-0 kubenswrapper[13984]: I0312 12:48:06.179078 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:06.616807 master-0 kubenswrapper[13984]: I0312 12:48:06.616758 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:06.616807 master-0 kubenswrapper[13984]: I0312 12:48:06.616799 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:09.216753 master-0 kubenswrapper[13984]: I0312 12:48:09.216678 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-external-api-0"
Mar 12 12:48:09.217366 master-0 kubenswrapper[13984]: I0312 12:48:09.216825 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 12:48:09.251589 master-0 kubenswrapper[13984]: I0312 12:48:09.251450 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-external-api-0"
Mar 12 12:48:09.719157 master-0 kubenswrapper[13984]: I0312 12:48:09.719079 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerStarted","Data":"e3b5768c37d0235b52cac894792cae6af357605cab7eb9e9b9c19515dd1cd74d"}
Mar 12 12:48:09.733882 master-0 kubenswrapper[13984]: I0312 12:48:09.733446 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerStarted","Data":"c1b45d4823f2f06422f5f80517d463e31b014fd8c3c8903cff1e9052336a358c"}
Mar 12 12:48:10.487344 master-0 kubenswrapper[13984]: I0312 12:48:10.487278 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:10.487933 master-0 kubenswrapper[13984]: I0312 12:48:10.487417 13984 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 12 12:48:10.495390 master-0 kubenswrapper[13984]: I0312 12:48:10.495328 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-f98a5-default-internal-api-0"
Mar 12 12:48:10.759534 master-0 kubenswrapper[13984]: I0312 12:48:10.759373 13984 generic.go:334] "Generic (PLEG): container finished" podID="61638247-2447-4459-8be8-f91c2e718b46" containerID="c1b45d4823f2f06422f5f80517d463e31b014fd8c3c8903cff1e9052336a358c" exitCode=0
Mar 12 12:48:10.759800 master-0 kubenswrapper[13984]: I0312 12:48:10.759560 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerDied","Data":"c1b45d4823f2f06422f5f80517d463e31b014fd8c3c8903cff1e9052336a358c"}
Mar 12 12:48:11.837449 master-0 kubenswrapper[13984]: I0312 12:48:11.837309 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerStarted","Data":"695f920ddcb316d5c3d7265b6d75625ac6e95babb7e7df6b8c5f3bba55ac74f3"}
Mar 12 12:48:12.855499 master-0 kubenswrapper[13984]: I0312 12:48:12.855416 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerStarted","Data":"30b47ce58d9f669da22194e5899ffc3beb5ef39c9c8eb39a5edcad49ebd7467f"}
Mar 12 12:48:12.855499 master-0 kubenswrapper[13984]: I0312 12:48:12.855472 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerStarted","Data":"6c8d23ebcc96dff93e79694c3466529e7c6129b03ed2851d800c41ee8fd57757"}
Mar 12 12:48:13.892352 master-0 kubenswrapper[13984]: I0312 12:48:13.892272 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerStarted","Data":"43641d92af468d89aae439e3b30d429a851b24cbcd4fe53c2fbb66d02cc940ae"}
Mar 12 12:48:13.892352 master-0 kubenswrapper[13984]: I0312 12:48:13.892355 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"61638247-2447-4459-8be8-f91c2e718b46","Type":"ContainerStarted","Data":"aed7b6a530898f2b163728dd1517cbb459eac0d3b6d28ed717a45214ea85cc5e"}
Mar 12 12:48:13.894713 master-0 kubenswrapper[13984]: I0312 12:48:13.894658 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 12 12:48:13.894713 master-0 kubenswrapper[13984]: I0312 12:48:13.894733 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 12 12:48:13.941173 master-0 kubenswrapper[13984]: I0312 12:48:13.941061 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=8.87121965 podStartE2EDuration="18.941034089s" podCreationTimestamp="2026-03-12 12:47:55 +0000 UTC" firstStartedPulling="2026-03-12 12:47:58.489049106 +0000 UTC m=+1410.687064598" lastFinishedPulling="2026-03-12 12:48:08.558863545 +0000 UTC m=+1420.756879037" observedRunningTime="2026-03-12 12:48:13.939206426 +0000 UTC m=+1426.137221928" watchObservedRunningTime="2026-03-12 12:48:13.941034089 +0000 UTC m=+1426.139049581"
Mar 12 12:48:15.851613 master-0 kubenswrapper[13984]: I0312 12:48:15.851549 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 12 12:48:15.851613 master-0 kubenswrapper[13984]: I0312 12:48:15.851625 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 12 12:48:15.852264 master-0 kubenswrapper[13984]: I0312 12:48:15.851679 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 12 12:48:15.852264 master-0 kubenswrapper[13984]: I0312 12:48:15.851694 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 12 12:48:15.883312 master-0 kubenswrapper[13984]: I0312 12:48:15.881753 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 12 12:48:15.885163 master-0 kubenswrapper[13984]: I0312 12:48:15.885120 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 12 12:48:15.928308 master-0 kubenswrapper[13984]: I0312 12:48:15.928110 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 12 12:48:15.930154 master-0 kubenswrapper[13984]: I0312 12:48:15.930119 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 12 12:48:16.993692 master-0 kubenswrapper[13984]: I0312 12:48:16.993638 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 12 12:48:17.943215 master-0 kubenswrapper[13984]: I0312 12:48:17.943162 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 12 12:48:23.032141 master-0 kubenswrapper[13984]: I0312 12:48:23.030390 13984 generic.go:334] "Generic (PLEG): container finished" podID="fc8b81ae-1976-4abe-8d82-4decd232dd98" containerID="b9e7771a91d8bfa9b9fd1e995379658c2be0a95aece8fbe61fd23bc4ca1c52bf" exitCode=0
Mar 12 12:48:23.032141 master-0 kubenswrapper[13984]: I0312 12:48:23.030457 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cd96k" event={"ID":"fc8b81ae-1976-4abe-8d82-4decd232dd98","Type":"ContainerDied","Data":"b9e7771a91d8bfa9b9fd1e995379658c2be0a95aece8fbe61fd23bc4ca1c52bf"}
Mar 12 12:48:24.498392 master-0 kubenswrapper[13984]: I0312 12:48:24.498352 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:48:24.662445 master-0 kubenswrapper[13984]: I0312 12:48:24.662295 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-combined-ca-bundle\") pod \"fc8b81ae-1976-4abe-8d82-4decd232dd98\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") "
Mar 12 12:48:24.662662 master-0 kubenswrapper[13984]: I0312 12:48:24.662562 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rksns\" (UniqueName: \"kubernetes.io/projected/fc8b81ae-1976-4abe-8d82-4decd232dd98-kube-api-access-rksns\") pod \"fc8b81ae-1976-4abe-8d82-4decd232dd98\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") "
Mar 12 12:48:24.662721 master-0 kubenswrapper[13984]: I0312 12:48:24.662691 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-config-data\") pod \"fc8b81ae-1976-4abe-8d82-4decd232dd98\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") "
Mar 12 12:48:24.662814 master-0 kubenswrapper[13984]: I0312 12:48:24.662784 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-scripts\") pod \"fc8b81ae-1976-4abe-8d82-4decd232dd98\" (UID: \"fc8b81ae-1976-4abe-8d82-4decd232dd98\") "
Mar 12 12:48:24.666672 master-0 kubenswrapper[13984]: I0312 12:48:24.666595 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8b81ae-1976-4abe-8d82-4decd232dd98-kube-api-access-rksns" (OuterVolumeSpecName: "kube-api-access-rksns") pod "fc8b81ae-1976-4abe-8d82-4decd232dd98" (UID: "fc8b81ae-1976-4abe-8d82-4decd232dd98"). InnerVolumeSpecName "kube-api-access-rksns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:48:24.667649 master-0 kubenswrapper[13984]: I0312 12:48:24.667518 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-scripts" (OuterVolumeSpecName: "scripts") pod "fc8b81ae-1976-4abe-8d82-4decd232dd98" (UID: "fc8b81ae-1976-4abe-8d82-4decd232dd98"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:48:24.698518 master-0 kubenswrapper[13984]: I0312 12:48:24.698453 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc8b81ae-1976-4abe-8d82-4decd232dd98" (UID: "fc8b81ae-1976-4abe-8d82-4decd232dd98"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:48:24.714704 master-0 kubenswrapper[13984]: I0312 12:48:24.714649 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-config-data" (OuterVolumeSpecName: "config-data") pod "fc8b81ae-1976-4abe-8d82-4decd232dd98" (UID: "fc8b81ae-1976-4abe-8d82-4decd232dd98"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:48:24.768420 master-0 kubenswrapper[13984]: I0312 12:48:24.766313 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rksns\" (UniqueName: \"kubernetes.io/projected/fc8b81ae-1976-4abe-8d82-4decd232dd98-kube-api-access-rksns\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:24.768420 master-0 kubenswrapper[13984]: I0312 12:48:24.766374 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:24.768420 master-0 kubenswrapper[13984]: I0312 12:48:24.766389 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-scripts\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:24.768420 master-0 kubenswrapper[13984]: I0312 12:48:24.766403 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8b81ae-1976-4abe-8d82-4decd232dd98-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:25.054454 master-0 kubenswrapper[13984]: I0312 12:48:25.054396 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cd96k" event={"ID":"fc8b81ae-1976-4abe-8d82-4decd232dd98","Type":"ContainerDied","Data":"d3762c4c159ca286391eafba0e257db35b1805c173d75ad454a2525795db954f"}
Mar 12 12:48:25.054454 master-0 kubenswrapper[13984]: I0312 12:48:25.054448 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3762c4c159ca286391eafba0e257db35b1805c173d75ad454a2525795db954f"
Mar 12 12:48:25.054454 master-0 kubenswrapper[13984]: I0312 12:48:25.054448 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cd96k"
Mar 12 12:48:25.208886 master-0 kubenswrapper[13984]: I0312 12:48:25.208827 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: E0312 12:48:25.209332 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerName="init"
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: I0312 12:48:25.209350 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerName="init"
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: E0312 12:48:25.209361 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8b81ae-1976-4abe-8d82-4decd232dd98" containerName="nova-cell0-conductor-db-sync"
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: I0312 12:48:25.209367 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8b81ae-1976-4abe-8d82-4decd232dd98" containerName="nova-cell0-conductor-db-sync"
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: E0312 12:48:25.209385 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerName="dnsmasq-dns"
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: I0312 12:48:25.209391 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerName="dnsmasq-dns"
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: I0312 12:48:25.209665 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1893ef7-895c-49c1-bfcc-468af72a46a6" containerName="dnsmasq-dns"
Mar 12 12:48:25.209996 master-0 kubenswrapper[13984]: I0312 12:48:25.209683 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8b81ae-1976-4abe-8d82-4decd232dd98" containerName="nova-cell0-conductor-db-sync"
Mar 12 12:48:25.211691 master-0 kubenswrapper[13984]: I0312 12:48:25.210766 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.219581 master-0 kubenswrapper[13984]: I0312 12:48:25.216898 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 12 12:48:25.264956 master-0 kubenswrapper[13984]: I0312 12:48:25.254093 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 12:48:25.381952 master-0 kubenswrapper[13984]: I0312 12:48:25.381787 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5dlt\" (UniqueName: \"kubernetes.io/projected/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-kube-api-access-g5dlt\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.382189 master-0 kubenswrapper[13984]: I0312 12:48:25.381951 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.382422 master-0 kubenswrapper[13984]: I0312 12:48:25.382331 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.485633 master-0 kubenswrapper[13984]: I0312 12:48:25.485578 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.485882 master-0 kubenswrapper[13984]: I0312 12:48:25.485696 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5dlt\" (UniqueName: \"kubernetes.io/projected/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-kube-api-access-g5dlt\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.485882 master-0 kubenswrapper[13984]: I0312 12:48:25.485869 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.489169 master-0 kubenswrapper[13984]: I0312 12:48:25.489120 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.491157 master-0 kubenswrapper[13984]: I0312 12:48:25.491124 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.502337 master-0 kubenswrapper[13984]: I0312 12:48:25.502291 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5dlt\" (UniqueName: \"kubernetes.io/projected/a9ed2b2f-ae45-4aaa-b4ac-328625f9b852-kube-api-access-g5dlt\") pod \"nova-cell0-conductor-0\" (UID: \"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852\") " pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:25.554854 master-0 kubenswrapper[13984]: I0312 12:48:25.554784 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:26.052903 master-0 kubenswrapper[13984]: I0312 12:48:26.052780 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 12 12:48:26.070485 master-0 kubenswrapper[13984]: I0312 12:48:26.070412 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852","Type":"ContainerStarted","Data":"56d8ea34a2d54e9044f7fc84e1536a8bf5a2ad4824dbe1f43ee295cc9ec5941a"}
Mar 12 12:48:27.082830 master-0 kubenswrapper[13984]: I0312 12:48:27.082770 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a9ed2b2f-ae45-4aaa-b4ac-328625f9b852","Type":"ContainerStarted","Data":"374e98b319b1a65dd84bdd76b51003ca3bb87b71172243d45463f79f7cc9bd0d"}
Mar 12 12:48:27.111818 master-0 kubenswrapper[13984]: I0312 12:48:27.111746 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.11173101 podStartE2EDuration="2.11173101s" podCreationTimestamp="2026-03-12 12:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:48:27.104990321 +0000 UTC m=+1439.303005803" watchObservedRunningTime="2026-03-12 12:48:27.11173101 +0000 UTC m=+1439.309746502"
Mar 12 12:48:28.095539 master-0 kubenswrapper[13984]: I0312 12:48:28.095444 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:35.584543 master-0 kubenswrapper[13984]: I0312 12:48:35.584496 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Mar 12 12:48:36.153778 master-0 kubenswrapper[13984]: I0312 12:48:36.153707 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vd6w8"]
Mar 12 12:48:36.155718 master-0 kubenswrapper[13984]: I0312 12:48:36.155671 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.164580 master-0 kubenswrapper[13984]: I0312 12:48:36.164504 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vd6w8"]
Mar 12 12:48:36.200370 master-0 kubenswrapper[13984]: I0312 12:48:36.200057 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts"
Mar 12 12:48:36.200370 master-0 kubenswrapper[13984]: I0312 12:48:36.200201 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data"
Mar 12 12:48:36.325425 master-0 kubenswrapper[13984]: I0312 12:48:36.325119 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 12 12:48:36.330507 master-0 kubenswrapper[13984]: I0312 12:48:36.328087 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.335519 master-0 kubenswrapper[13984]: I0312 12:48:36.335441 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data"
Mar 12 12:48:36.338431 master-0 kubenswrapper[13984]: I0312 12:48:36.338364 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 12 12:48:36.354555 master-0 kubenswrapper[13984]: I0312 12:48:36.351361 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d95490-fdae-4ecb-961d-553a7fda1436-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.354555 master-0 kubenswrapper[13984]: I0312 12:48:36.351425 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-config-data\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.354555 master-0 kubenswrapper[13984]: I0312 12:48:36.351524 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqbph\" (UniqueName: \"kubernetes.io/projected/f6d95490-fdae-4ecb-961d-553a7fda1436-kube-api-access-vqbph\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.354555 master-0 kubenswrapper[13984]: I0312 12:48:36.351579 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-scripts\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.354964 master-0 kubenswrapper[13984]: I0312 12:48:36.354681 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.354964 master-0 kubenswrapper[13984]: I0312 12:48:36.354763 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdmzn\" (UniqueName: \"kubernetes.io/projected/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-kube-api-access-bdmzn\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.354964 master-0 kubenswrapper[13984]: I0312 12:48:36.354858 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d95490-fdae-4ecb-961d-553a7fda1436-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.463849 master-0 kubenswrapper[13984]: I0312 12:48:36.463741 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-scripts\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.464237 master-0 kubenswrapper[13984]: I0312 12:48:36.464205 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.464348 master-0 kubenswrapper[13984]: I0312 12:48:36.464329 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdmzn\" (UniqueName: \"kubernetes.io/projected/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-kube-api-access-bdmzn\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.464470 master-0 kubenswrapper[13984]: I0312 12:48:36.464420 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d95490-fdae-4ecb-961d-553a7fda1436-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.464700 master-0 kubenswrapper[13984]: I0312 12:48:36.464627 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d95490-fdae-4ecb-961d-553a7fda1436-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.472763 master-0 kubenswrapper[13984]: I0312 12:48:36.465253 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-config-data\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.472763 master-0 kubenswrapper[13984]: I0312 12:48:36.465416 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqbph\" (UniqueName: \"kubernetes.io/projected/f6d95490-fdae-4ecb-961d-553a7fda1436-kube-api-access-vqbph\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.478886 master-0 kubenswrapper[13984]: I0312 12:48:36.476400 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-scripts\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.478886 master-0 kubenswrapper[13984]: I0312 12:48:36.477756 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6d95490-fdae-4ecb-961d-553a7fda1436-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.490646 master-0 kubenswrapper[13984]: I0312 12:48:36.487636 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6d95490-fdae-4ecb-961d-553a7fda1436-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.490646 master-0 kubenswrapper[13984]: I0312 12:48:36.490139 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-config-data\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.490646 master-0 kubenswrapper[13984]: I0312 12:48:36.490626 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.501333 master-0 kubenswrapper[13984]: I0312 12:48:36.501291 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:48:36.503971 master-0 kubenswrapper[13984]: I0312 12:48:36.503948 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 12:48:36.528212 master-0 kubenswrapper[13984]: I0312 12:48:36.518802 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 12 12:48:36.528212 master-0 kubenswrapper[13984]: I0312 12:48:36.525653 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqbph\" (UniqueName: \"kubernetes.io/projected/f6d95490-fdae-4ecb-961d-553a7fda1436-kube-api-access-vqbph\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"f6d95490-fdae-4ecb-961d-553a7fda1436\") " pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.561178 master-0 kubenswrapper[13984]: I0312 12:48:36.559307 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdmzn\" (UniqueName: \"kubernetes.io/projected/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-kube-api-access-bdmzn\") pod \"nova-cell0-cell-mapping-vd6w8\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " pod="openstack/nova-cell0-cell-mapping-vd6w8"
Mar 12 12:48:36.566318 master-0 kubenswrapper[13984]: I0312 12:48:36.563263 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:48:36.587342 master-0 kubenswrapper[13984]: I0312 12:48:36.578319 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:36.601502 master-0 kubenswrapper[13984]: I0312 12:48:36.595614 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 12:48:36.611580 master-0 kubenswrapper[13984]: I0312 12:48:36.610062 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 12 12:48:36.636497 master-0 kubenswrapper[13984]: I0312 12:48:36.634656 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:36.664636 master-0 kubenswrapper[13984]: I0312 12:48:36.662863 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 12:48:36.670055 master-0 kubenswrapper[13984]: I0312 12:48:36.669160 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 12 12:48:36.677867 master-0 kubenswrapper[13984]: I0312 12:48:36.677146 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.681167 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.683281 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-config-data\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.683391 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-config-data\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.683430 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a6896c-57a5-4e52-8db8-5224f277cfcb-logs\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.683493 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.692867 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e406a2b-18fa-448b-888e-9e6a95bbe747-logs\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.693127 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmb9l\" (UniqueName: \"kubernetes.io/projected/11a6896c-57a5-4e52-8db8-5224f277cfcb-kube-api-access-rmb9l\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.693188 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0"
Mar 12 12:48:36.693521 master-0 kubenswrapper[13984]: I0312 12:48:36.693459 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2ss8\" (UniqueName: \"kubernetes.io/projected/0e406a2b-18fa-448b-888e-9e6a95bbe747-kube-api-access-b2ss8\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0"
Mar 12 12:48:36.730500 master-0 kubenswrapper[13984]: I0312 12:48:36.730448 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 12:48:36.801328 master-0 kubenswrapper[13984]: I0312 12:48:36.801270 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-config-data\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0"
Mar 12 12:48:36.801328 master-0 kubenswrapper[13984]: I0312 12:48:36.801330 13984
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a6896c-57a5-4e52-8db8-5224f277cfcb-logs\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.801579 master-0 kubenswrapper[13984]: I0312 12:48:36.801359 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.801579 master-0 kubenswrapper[13984]: I0312 12:48:36.801391 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e406a2b-18fa-448b-888e-9e6a95bbe747-logs\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0" Mar 12 12:48:36.801579 master-0 kubenswrapper[13984]: I0312 12:48:36.801416 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srqg\" (UniqueName: \"kubernetes.io/projected/295668ef-83e3-408c-a447-34e6a52cf61e-kube-api-access-7srqg\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.801579 master-0 kubenswrapper[13984]: I0312 12:48:36.801490 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmb9l\" (UniqueName: \"kubernetes.io/projected/11a6896c-57a5-4e52-8db8-5224f277cfcb-kube-api-access-rmb9l\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.801579 master-0 kubenswrapper[13984]: I0312 12:48:36.801524 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0" Mar 12 12:48:36.801742 master-0 kubenswrapper[13984]: I0312 12:48:36.801602 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.801742 master-0 kubenswrapper[13984]: I0312 12:48:36.801654 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2ss8\" (UniqueName: \"kubernetes.io/projected/0e406a2b-18fa-448b-888e-9e6a95bbe747-kube-api-access-b2ss8\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0" Mar 12 12:48:36.801742 master-0 kubenswrapper[13984]: I0312 12:48:36.801673 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.801742 master-0 kubenswrapper[13984]: I0312 12:48:36.801692 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-config-data\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.831616 master-0 kubenswrapper[13984]: I0312 12:48:36.820291 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-config-data\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0" Mar 12 12:48:36.831616 master-0 kubenswrapper[13984]: I0312 12:48:36.820648 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a6896c-57a5-4e52-8db8-5224f277cfcb-logs\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.831616 master-0 kubenswrapper[13984]: I0312 12:48:36.829957 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e406a2b-18fa-448b-888e-9e6a95bbe747-logs\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0" Mar 12 12:48:36.831616 master-0 kubenswrapper[13984]: I0312 12:48:36.830222 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.838217 master-0 kubenswrapper[13984]: I0312 12:48:36.838095 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vd6w8" Mar 12 12:48:36.848501 master-0 kubenswrapper[13984]: I0312 12:48:36.845088 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:48:36.848501 master-0 kubenswrapper[13984]: I0312 12:48:36.845671 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2ss8\" (UniqueName: \"kubernetes.io/projected/0e406a2b-18fa-448b-888e-9e6a95bbe747-kube-api-access-b2ss8\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0" Mar 12 12:48:36.848501 master-0 kubenswrapper[13984]: I0312 12:48:36.846910 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 12:48:36.854608 master-0 kubenswrapper[13984]: I0312 12:48:36.849293 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-config-data\") pod \"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.854608 master-0 kubenswrapper[13984]: I0312 12:48:36.849885 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " pod="openstack/nova-api-0" Mar 12 12:48:36.861760 master-0 kubenswrapper[13984]: I0312 12:48:36.860368 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 12:48:36.878130 master-0 kubenswrapper[13984]: I0312 12:48:36.876330 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmb9l\" (UniqueName: \"kubernetes.io/projected/11a6896c-57a5-4e52-8db8-5224f277cfcb-kube-api-access-rmb9l\") pod 
\"nova-metadata-0\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") " pod="openstack/nova-metadata-0" Mar 12 12:48:36.904728 master-0 kubenswrapper[13984]: I0312 12:48:36.904677 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:48:36.908980 master-0 kubenswrapper[13984]: I0312 12:48:36.908946 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.909080 master-0 kubenswrapper[13984]: I0312 12:48:36.909020 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.909159 master-0 kubenswrapper[13984]: I0312 12:48:36.909098 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srqg\" (UniqueName: \"kubernetes.io/projected/295668ef-83e3-408c-a447-34e6a52cf61e-kube-api-access-7srqg\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.913145 master-0 kubenswrapper[13984]: I0312 12:48:36.913120 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.915233 master-0 kubenswrapper[13984]: I0312 12:48:36.915205 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:36.926717 master-0 kubenswrapper[13984]: I0312 12:48:36.926426 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srqg\" (UniqueName: \"kubernetes.io/projected/295668ef-83e3-408c-a447-34e6a52cf61e-kube-api-access-7srqg\") pod \"nova-cell1-novncproxy-0\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:37.011605 master-0 kubenswrapper[13984]: I0312 12:48:37.011120 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9dqb\" (UniqueName: \"kubernetes.io/projected/be2c270c-f966-42c6-858d-7392b2902c0d-kube-api-access-z9dqb\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.011605 master-0 kubenswrapper[13984]: I0312 12:48:37.011439 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-config-data\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.011605 master-0 kubenswrapper[13984]: I0312 12:48:37.011473 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.031764 master-0 kubenswrapper[13984]: I0312 12:48:37.031145 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:48:37.073536 master-0 kubenswrapper[13984]: I0312 12:48:37.073411 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64fbc4c555-858hz"] Mar 12 12:48:37.076776 master-0 kubenswrapper[13984]: I0312 12:48:37.076727 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.089385 master-0 kubenswrapper[13984]: I0312 12:48:37.087726 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64fbc4c555-858hz"] Mar 12 12:48:37.114663 master-0 kubenswrapper[13984]: I0312 12:48:37.114529 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.114663 master-0 kubenswrapper[13984]: I0312 12:48:37.114606 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-config-data\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.115331 master-0 kubenswrapper[13984]: I0312 12:48:37.115247 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9dqb\" (UniqueName: \"kubernetes.io/projected/be2c270c-f966-42c6-858d-7392b2902c0d-kube-api-access-z9dqb\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.141003 master-0 kubenswrapper[13984]: I0312 12:48:37.124662 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.141003 master-0 kubenswrapper[13984]: I0312 12:48:37.125182 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-config-data\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.142952 master-0 kubenswrapper[13984]: I0312 12:48:37.141334 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9dqb\" (UniqueName: \"kubernetes.io/projected/be2c270c-f966-42c6-858d-7392b2902c0d-kube-api-access-z9dqb\") pod \"nova-scheduler-0\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") " pod="openstack/nova-scheduler-0" Mar 12 12:48:37.165712 master-0 kubenswrapper[13984]: I0312 12:48:37.165639 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:48:37.207161 master-0 kubenswrapper[13984]: I0312 12:48:37.207039 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:48:37.220983 master-0 kubenswrapper[13984]: I0312 12:48:37.220741 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-config\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.220983 master-0 kubenswrapper[13984]: I0312 12:48:37.220831 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-nb\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.220983 master-0 kubenswrapper[13984]: I0312 12:48:37.220904 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-sb\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.220983 master-0 kubenswrapper[13984]: I0312 12:48:37.220937 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-svc\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.220983 master-0 kubenswrapper[13984]: I0312 12:48:37.220970 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-swift-storage-0\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.221312 master-0 kubenswrapper[13984]: I0312 12:48:37.221084 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbc5w\" (UniqueName: \"kubernetes.io/projected/cc025fea-4c26-4327-b33c-0f29566b6b50-kube-api-access-nbc5w\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.243911 master-0 kubenswrapper[13984]: I0312 12:48:37.222792 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 12:48:37.329136 master-0 kubenswrapper[13984]: I0312 12:48:37.328887 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbc5w\" (UniqueName: \"kubernetes.io/projected/cc025fea-4c26-4327-b33c-0f29566b6b50-kube-api-access-nbc5w\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.329136 master-0 kubenswrapper[13984]: I0312 12:48:37.328970 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-config\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.329136 master-0 kubenswrapper[13984]: I0312 12:48:37.329051 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-nb\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") 
" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.329136 master-0 kubenswrapper[13984]: I0312 12:48:37.329144 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-sb\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.329471 master-0 kubenswrapper[13984]: I0312 12:48:37.329183 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-svc\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.329471 master-0 kubenswrapper[13984]: I0312 12:48:37.329214 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-swift-storage-0\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.330278 master-0 kubenswrapper[13984]: I0312 12:48:37.330242 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-swift-storage-0\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.331308 master-0 kubenswrapper[13984]: I0312 12:48:37.331254 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-config\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " 
pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.346696 master-0 kubenswrapper[13984]: I0312 12:48:37.336537 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-svc\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.346696 master-0 kubenswrapper[13984]: I0312 12:48:37.337776 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-sb\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.346696 master-0 kubenswrapper[13984]: I0312 12:48:37.341135 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-nb\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.346696 master-0 kubenswrapper[13984]: I0312 12:48:37.346558 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 12 12:48:37.407322 master-0 kubenswrapper[13984]: I0312 12:48:37.364555 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rf48n"] Mar 12 12:48:37.407322 master-0 kubenswrapper[13984]: I0312 12:48:37.366572 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rf48n" Mar 12 12:48:37.436464 master-0 kubenswrapper[13984]: I0312 12:48:37.435406 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbc5w\" (UniqueName: \"kubernetes.io/projected/cc025fea-4c26-4327-b33c-0f29566b6b50-kube-api-access-nbc5w\") pod \"dnsmasq-dns-64fbc4c555-858hz\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:37.442670 master-0 kubenswrapper[13984]: I0312 12:48:37.442595 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 12 12:48:37.444027 master-0 kubenswrapper[13984]: I0312 12:48:37.443940 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 12 12:48:37.451587 master-0 kubenswrapper[13984]: I0312 12:48:37.451533 13984 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 12:48:37.518930 master-0 kubenswrapper[13984]: I0312 12:48:37.465958 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n" Mar 12 12:48:37.518930 master-0 kubenswrapper[13984]: I0312 12:48:37.466140 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-scripts\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n" Mar 12 12:48:37.518930 master-0 kubenswrapper[13984]: I0312 12:48:37.466397 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2jt2\" (UniqueName: \"kubernetes.io/projected/ea457152-df63-450a-be0c-aaca7e72b0b4-kube-api-access-n2jt2\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.518930 master-0 kubenswrapper[13984]: I0312 12:48:37.466509 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-config-data\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.574604 master-0 kubenswrapper[13984]: I0312 12:48:37.574559 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.579139 master-0 kubenswrapper[13984]: I0312 12:48:37.579101 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-scripts\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.580345 master-0 kubenswrapper[13984]: I0312 12:48:37.580322 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2jt2\" (UniqueName: \"kubernetes.io/projected/ea457152-df63-450a-be0c-aaca7e72b0b4-kube-api-access-n2jt2\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.581423 master-0 kubenswrapper[13984]: I0312 12:48:37.581380 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-scripts\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.581899 master-0 kubenswrapper[13984]: I0312 12:48:37.581879 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-config-data\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.583085 master-0 kubenswrapper[13984]: I0312 12:48:37.583028 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rf48n"]
Mar 12 12:48:37.583462 master-0 kubenswrapper[13984]: I0312 12:48:37.583424 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.590126 master-0 kubenswrapper[13984]: I0312 12:48:37.590060 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-config-data\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.671124 master-0 kubenswrapper[13984]: I0312 12:48:37.671021 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2jt2\" (UniqueName: \"kubernetes.io/projected/ea457152-df63-450a-be0c-aaca7e72b0b4-kube-api-access-n2jt2\") pod \"nova-cell1-conductor-db-sync-rf48n\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:37.703002 master-0 kubenswrapper[13984]: I0312 12:48:37.700339 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fbc4c555-858hz"
Mar 12 12:48:37.738800 master-0 kubenswrapper[13984]: I0312 12:48:37.738711 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vd6w8"]
Mar 12 12:48:37.765093 master-0 kubenswrapper[13984]: I0312 12:48:37.762835 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:38.097713 master-0 kubenswrapper[13984]: I0312 12:48:38.095963 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 12:48:38.120028 master-0 kubenswrapper[13984]: I0312 12:48:38.119928 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 12:48:38.140885 master-0 kubenswrapper[13984]: I0312 12:48:38.138983 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:38.152708 master-0 kubenswrapper[13984]: I0312 12:48:38.152417 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:48:38.271771 master-0 kubenswrapper[13984]: I0312 12:48:38.271719 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64fbc4c555-858hz"]
Mar 12 12:48:38.282571 master-0 kubenswrapper[13984]: I0312 12:48:38.281295 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be2c270c-f966-42c6-858d-7392b2902c0d","Type":"ContainerStarted","Data":"3ef4590d6d6e31d6fd3ceb975358a7354ef07d1f881d3cc4fdbefeaa13cc7b2d"}
Mar 12 12:48:38.285053 master-0 kubenswrapper[13984]: I0312 12:48:38.284614 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e406a2b-18fa-448b-888e-9e6a95bbe747","Type":"ContainerStarted","Data":"62b859158088b79415fb2c90f02bb96f92d7a40ad3ea82146f3b1dfb9252e154"}
Mar 12 12:48:38.288470 master-0 kubenswrapper[13984]: I0312 12:48:38.286792 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"295668ef-83e3-408c-a447-34e6a52cf61e","Type":"ContainerStarted","Data":"94bdbea22967a8d8ce6d69d97fa63569973685ab248ef46772141dee0448d3f6"}
Mar 12 12:48:38.288639 master-0 kubenswrapper[13984]: I0312 12:48:38.288391 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" event={"ID":"cc025fea-4c26-4327-b33c-0f29566b6b50","Type":"ContainerStarted","Data":"72ca92dd755c16daf3b54552c75ecb61052592e0cae074b0e1275e4393b3ca06"}
Mar 12 12:48:38.290677 master-0 kubenswrapper[13984]: I0312 12:48:38.290649 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"f6d95490-fdae-4ecb-961d-553a7fda1436","Type":"ContainerStarted","Data":"17cc1cf3af0100c260d24217dc4c6f012fc8e1f02ab0dcc8bd8dbcc62f665ec1"}
Mar 12 12:48:38.294514 master-0 kubenswrapper[13984]: I0312 12:48:38.294486 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vd6w8" event={"ID":"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e","Type":"ContainerStarted","Data":"992d46a6738bd355d5fe196354a0271b648731fd3583de6c8c51344d25800e0a"}
Mar 12 12:48:38.294514 master-0 kubenswrapper[13984]: I0312 12:48:38.294514 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vd6w8" event={"ID":"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e","Type":"ContainerStarted","Data":"8d317ace1c7b737a2ed39a0186f8f2fceef4ca1dad786ae42394415e40ce5f98"}
Mar 12 12:48:38.296013 master-0 kubenswrapper[13984]: I0312 12:48:38.295990 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11a6896c-57a5-4e52-8db8-5224f277cfcb","Type":"ContainerStarted","Data":"d7c1c0b80e458b26981f12c4292c6140120032dfb2870ddc4b3350800b803470"}
Mar 12 12:48:38.447516 master-0 kubenswrapper[13984]: I0312 12:48:38.447430 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vd6w8" podStartSLOduration=2.447411834 podStartE2EDuration="2.447411834s" podCreationTimestamp="2026-03-12 12:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:48:38.323107718 +0000 UTC m=+1450.521123220" watchObservedRunningTime="2026-03-12 12:48:38.447411834 +0000 UTC m=+1450.645427326"
Mar 12 12:48:38.459379 master-0 kubenswrapper[13984]: W0312 12:48:38.459340 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podea457152_df63_450a_be0c_aaca7e72b0b4.slice/crio-f74e11103f78bc62152200e9759a500cab51afe893968503fbed0e326fa8282a WatchSource:0}: Error finding container f74e11103f78bc62152200e9759a500cab51afe893968503fbed0e326fa8282a: Status 404 returned error can't find the container with id f74e11103f78bc62152200e9759a500cab51afe893968503fbed0e326fa8282a
Mar 12 12:48:38.464849 master-0 kubenswrapper[13984]: I0312 12:48:38.462985 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rf48n"]
Mar 12 12:48:39.332694 master-0 kubenswrapper[13984]: I0312 12:48:39.332639 13984 generic.go:334] "Generic (PLEG): container finished" podID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerID="3025f131d59f0777650800ba419a85418d16f72c075108ce2a2e97b05f16ac97" exitCode=0
Mar 12 12:48:39.333397 master-0 kubenswrapper[13984]: I0312 12:48:39.333370 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" event={"ID":"cc025fea-4c26-4327-b33c-0f29566b6b50","Type":"ContainerDied","Data":"3025f131d59f0777650800ba419a85418d16f72c075108ce2a2e97b05f16ac97"}
Mar 12 12:48:39.377524 master-0 kubenswrapper[13984]: I0312 12:48:39.364772 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rf48n" event={"ID":"ea457152-df63-450a-be0c-aaca7e72b0b4","Type":"ContainerStarted","Data":"7c06ae70f62ae63073364d3a235d96da95cf089bef58484ccc550a4d314281e0"}
Mar 12 12:48:39.377524 master-0 kubenswrapper[13984]: I0312 12:48:39.364817 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rf48n" event={"ID":"ea457152-df63-450a-be0c-aaca7e72b0b4","Type":"ContainerStarted","Data":"f74e11103f78bc62152200e9759a500cab51afe893968503fbed0e326fa8282a"}
Mar 12 12:48:39.466779 master-0 kubenswrapper[13984]: I0312 12:48:39.466649 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rf48n" podStartSLOduration=2.466586369 podStartE2EDuration="2.466586369s" podCreationTimestamp="2026-03-12 12:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:48:39.390025521 +0000 UTC m=+1451.588041013" watchObservedRunningTime="2026-03-12 12:48:39.466586369 +0000 UTC m=+1451.664601861"
Mar 12 12:48:40.397514 master-0 kubenswrapper[13984]: I0312 12:48:40.395609 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" event={"ID":"cc025fea-4c26-4327-b33c-0f29566b6b50","Type":"ContainerStarted","Data":"3a00fa6fc636cdb063a263fc33e09c250a731ec1bc0c4460258c49ff774d8c5e"}
Mar 12 12:48:40.397514 master-0 kubenswrapper[13984]: I0312 12:48:40.395739 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64fbc4c555-858hz"
Mar 12 12:48:40.429511 master-0 kubenswrapper[13984]: I0312 12:48:40.424894 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" podStartSLOduration=4.424870366 podStartE2EDuration="4.424870366s" podCreationTimestamp="2026-03-12 12:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:48:40.421795512 +0000 UTC m=+1452.619811004" watchObservedRunningTime="2026-03-12 12:48:40.424870366 +0000 UTC m=+1452.622885868"
Mar 12 12:48:41.226675 master-0 kubenswrapper[13984]: I0312 12:48:41.225687 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:41.245503 master-0 kubenswrapper[13984]: I0312 12:48:41.242019 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 12 12:48:42.490328 master-0 kubenswrapper[13984]: I0312 12:48:42.490187 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="295668ef-83e3-408c-a447-34e6a52cf61e" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958" gracePeriod=30
Mar 12 12:48:42.500462 master-0 kubenswrapper[13984]: I0312 12:48:42.490622 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"295668ef-83e3-408c-a447-34e6a52cf61e","Type":"ContainerStarted","Data":"6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958"}
Mar 12 12:48:42.500462 master-0 kubenswrapper[13984]: I0312 12:48:42.495931 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be2c270c-f966-42c6-858d-7392b2902c0d","Type":"ContainerStarted","Data":"62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5"}
Mar 12 12:48:42.501582 master-0 kubenswrapper[13984]: I0312 12:48:42.499989 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e406a2b-18fa-448b-888e-9e6a95bbe747","Type":"ContainerStarted","Data":"8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242"}
Mar 12 12:48:42.522863 master-0 kubenswrapper[13984]: I0312 12:48:42.513166 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11a6896c-57a5-4e52-8db8-5224f277cfcb","Type":"ContainerStarted","Data":"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"}
Mar 12 12:48:42.545821 master-0 kubenswrapper[13984]: I0312 12:48:42.543643 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.7784298339999998 podStartE2EDuration="6.543622101s" podCreationTimestamp="2026-03-12 12:48:36 +0000 UTC" firstStartedPulling="2026-03-12 12:48:38.123748135 +0000 UTC m=+1450.321763627" lastFinishedPulling="2026-03-12 12:48:41.888940402 +0000 UTC m=+1454.086955894" observedRunningTime="2026-03-12 12:48:42.512508218 +0000 UTC m=+1454.710523710" watchObservedRunningTime="2026-03-12 12:48:42.543622101 +0000 UTC m=+1454.741637593"
Mar 12 12:48:42.563601 master-0 kubenswrapper[13984]: I0312 12:48:42.563218 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.763290228 podStartE2EDuration="6.563197887s" podCreationTimestamp="2026-03-12 12:48:36 +0000 UTC" firstStartedPulling="2026-03-12 12:48:38.132663899 +0000 UTC m=+1450.330679391" lastFinishedPulling="2026-03-12 12:48:41.932571558 +0000 UTC m=+1454.130587050" observedRunningTime="2026-03-12 12:48:42.549815791 +0000 UTC m=+1454.747831313" watchObservedRunningTime="2026-03-12 12:48:42.563197887 +0000 UTC m=+1454.761213379"
Mar 12 12:48:43.537905 master-0 kubenswrapper[13984]: I0312 12:48:43.537846 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11a6896c-57a5-4e52-8db8-5224f277cfcb","Type":"ContainerStarted","Data":"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"}
Mar 12 12:48:43.539969 master-0 kubenswrapper[13984]: I0312 12:48:43.537960 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-log" containerID="cri-o://e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80" gracePeriod=30
Mar 12 12:48:43.539969 master-0 kubenswrapper[13984]: I0312 12:48:43.538043 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-metadata" containerID="cri-o://2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398" gracePeriod=30
Mar 12 12:48:43.542984 master-0 kubenswrapper[13984]: I0312 12:48:43.542935 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e406a2b-18fa-448b-888e-9e6a95bbe747","Type":"ContainerStarted","Data":"ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0"}
Mar 12 12:48:43.611412 master-0 kubenswrapper[13984]: I0312 12:48:43.608664 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.796665553 podStartE2EDuration="7.608643272s" podCreationTimestamp="2026-03-12 12:48:36 +0000 UTC" firstStartedPulling="2026-03-12 12:48:38.149155191 +0000 UTC m=+1450.347170683" lastFinishedPulling="2026-03-12 12:48:41.96113291 +0000 UTC m=+1454.159148402" observedRunningTime="2026-03-12 12:48:43.563217867 +0000 UTC m=+1455.761233379" watchObservedRunningTime="2026-03-12 12:48:43.608643272 +0000 UTC m=+1455.806658754"
Mar 12 12:48:43.626573 master-0 kubenswrapper[13984]: I0312 12:48:43.625695 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.858997741 podStartE2EDuration="7.625676989s" podCreationTimestamp="2026-03-12 12:48:36 +0000 UTC" firstStartedPulling="2026-03-12 12:48:38.130931262 +0000 UTC m=+1450.328946754" lastFinishedPulling="2026-03-12 12:48:41.89761051 +0000 UTC m=+1454.095626002" observedRunningTime="2026-03-12 12:48:43.596195031 +0000 UTC m=+1455.794210533" watchObservedRunningTime="2026-03-12 12:48:43.625676989 +0000 UTC m=+1455.823692481"
Mar 12 12:48:44.335540 master-0 kubenswrapper[13984]: I0312 12:48:44.335455 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 12:48:44.356811 master-0 kubenswrapper[13984]: I0312 12:48:44.356756 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a6896c-57a5-4e52-8db8-5224f277cfcb-logs\") pod \"11a6896c-57a5-4e52-8db8-5224f277cfcb\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") "
Mar 12 12:48:44.357285 master-0 kubenswrapper[13984]: I0312 12:48:44.357262 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmb9l\" (UniqueName: \"kubernetes.io/projected/11a6896c-57a5-4e52-8db8-5224f277cfcb-kube-api-access-rmb9l\") pod \"11a6896c-57a5-4e52-8db8-5224f277cfcb\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") "
Mar 12 12:48:44.357352 master-0 kubenswrapper[13984]: I0312 12:48:44.357333 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-config-data\") pod \"11a6896c-57a5-4e52-8db8-5224f277cfcb\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") "
Mar 12 12:48:44.357401 master-0 kubenswrapper[13984]: I0312 12:48:44.357366 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-combined-ca-bundle\") pod \"11a6896c-57a5-4e52-8db8-5224f277cfcb\" (UID: \"11a6896c-57a5-4e52-8db8-5224f277cfcb\") "
Mar 12 12:48:44.357747 master-0 kubenswrapper[13984]: I0312 12:48:44.357627 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11a6896c-57a5-4e52-8db8-5224f277cfcb-logs" (OuterVolumeSpecName: "logs") pod "11a6896c-57a5-4e52-8db8-5224f277cfcb" (UID: "11a6896c-57a5-4e52-8db8-5224f277cfcb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 12 12:48:44.368062 master-0 kubenswrapper[13984]: I0312 12:48:44.368013 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11a6896c-57a5-4e52-8db8-5224f277cfcb-logs\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:44.375801 master-0 kubenswrapper[13984]: I0312 12:48:44.375734 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11a6896c-57a5-4e52-8db8-5224f277cfcb-kube-api-access-rmb9l" (OuterVolumeSpecName: "kube-api-access-rmb9l") pod "11a6896c-57a5-4e52-8db8-5224f277cfcb" (UID: "11a6896c-57a5-4e52-8db8-5224f277cfcb"). InnerVolumeSpecName "kube-api-access-rmb9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:48:44.414204 master-0 kubenswrapper[13984]: I0312 12:48:44.414148 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11a6896c-57a5-4e52-8db8-5224f277cfcb" (UID: "11a6896c-57a5-4e52-8db8-5224f277cfcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:48:44.420706 master-0 kubenswrapper[13984]: I0312 12:48:44.420659 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-config-data" (OuterVolumeSpecName: "config-data") pod "11a6896c-57a5-4e52-8db8-5224f277cfcb" (UID: "11a6896c-57a5-4e52-8db8-5224f277cfcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:48:44.471703 master-0 kubenswrapper[13984]: I0312 12:48:44.471566 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmb9l\" (UniqueName: \"kubernetes.io/projected/11a6896c-57a5-4e52-8db8-5224f277cfcb-kube-api-access-rmb9l\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:44.471703 master-0 kubenswrapper[13984]: I0312 12:48:44.471615 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:44.471703 master-0 kubenswrapper[13984]: I0312 12:48:44.471638 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11a6896c-57a5-4e52-8db8-5224f277cfcb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:48:44.569183 master-0 kubenswrapper[13984]: I0312 12:48:44.569138 13984 generic.go:334] "Generic (PLEG): container finished" podID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerID="2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398" exitCode=0
Mar 12 12:48:44.569183 master-0 kubenswrapper[13984]: I0312 12:48:44.569176 13984 generic.go:334] "Generic (PLEG): container finished" podID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerID="e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80" exitCode=143
Mar 12 12:48:44.570571 master-0 kubenswrapper[13984]: I0312 12:48:44.570530 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 12:48:44.574303 master-0 kubenswrapper[13984]: I0312 12:48:44.574218 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11a6896c-57a5-4e52-8db8-5224f277cfcb","Type":"ContainerDied","Data":"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"}
Mar 12 12:48:44.574421 master-0 kubenswrapper[13984]: I0312 12:48:44.574310 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11a6896c-57a5-4e52-8db8-5224f277cfcb","Type":"ContainerDied","Data":"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"}
Mar 12 12:48:44.574421 master-0 kubenswrapper[13984]: I0312 12:48:44.574323 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11a6896c-57a5-4e52-8db8-5224f277cfcb","Type":"ContainerDied","Data":"d7c1c0b80e458b26981f12c4292c6140120032dfb2870ddc4b3350800b803470"}
Mar 12 12:48:44.574421 master-0 kubenswrapper[13984]: I0312 12:48:44.574339 13984 scope.go:117] "RemoveContainer" containerID="2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"
Mar 12 12:48:44.614547 master-0 kubenswrapper[13984]: I0312 12:48:44.614429 13984 scope.go:117] "RemoveContainer" containerID="e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"
Mar 12 12:48:44.647892 master-0 kubenswrapper[13984]: I0312 12:48:44.647814 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:44.665681 master-0 kubenswrapper[13984]: I0312 12:48:44.665555 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:44.666551 master-0 kubenswrapper[13984]: I0312 12:48:44.666521 13984 scope.go:117] "RemoveContainer" containerID="2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"
Mar 12 12:48:44.670092 master-0 kubenswrapper[13984]: E0312 12:48:44.669259 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398\": container with ID starting with 2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398 not found: ID does not exist" containerID="2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"
Mar 12 12:48:44.670428 master-0 kubenswrapper[13984]: I0312 12:48:44.670379 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"} err="failed to get container status \"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398\": rpc error: code = NotFound desc = could not find container \"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398\": container with ID starting with 2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398 not found: ID does not exist"
Mar 12 12:48:44.670548 master-0 kubenswrapper[13984]: I0312 12:48:44.670531 13984 scope.go:117] "RemoveContainer" containerID="e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"
Mar 12 12:48:44.671126 master-0 kubenswrapper[13984]: E0312 12:48:44.671102 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80\": container with ID starting with e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80 not found: ID does not exist" containerID="e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"
Mar 12 12:48:44.671262 master-0 kubenswrapper[13984]: I0312 12:48:44.671238 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"} err="failed to get container status \"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80\": rpc error: code = NotFound desc = could not find container \"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80\": container with ID starting with e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80 not found: ID does not exist"
Mar 12 12:48:44.671359 master-0 kubenswrapper[13984]: I0312 12:48:44.671344 13984 scope.go:117] "RemoveContainer" containerID="2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"
Mar 12 12:48:44.671693 master-0 kubenswrapper[13984]: I0312 12:48:44.671670 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398"} err="failed to get container status \"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398\": rpc error: code = NotFound desc = could not find container \"2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398\": container with ID starting with 2d71b1077d998f485c8ef50db5d56bcd9257196a1bb50c450afbadf098877398 not found: ID does not exist"
Mar 12 12:48:44.671830 master-0 kubenswrapper[13984]: I0312 12:48:44.671815 13984 scope.go:117] "RemoveContainer" containerID="e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"
Mar 12 12:48:44.672159 master-0 kubenswrapper[13984]: I0312 12:48:44.672132 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80"} err="failed to get container status \"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80\": rpc error: code = NotFound desc = could not find container \"e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80\": container with ID starting with e78fa617b4ebacba8a43697cfc0e429291ebd76ea494a85b85de281c50d78d80 not found: ID does not exist"
Mar 12 12:48:44.695883 master-0 kubenswrapper[13984]: I0312 12:48:44.694758 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:44.696170 master-0 kubenswrapper[13984]: E0312 12:48:44.696071 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-metadata"
Mar 12 12:48:44.696170 master-0 kubenswrapper[13984]: I0312 12:48:44.696094 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-metadata"
Mar 12 12:48:44.696170 master-0 kubenswrapper[13984]: E0312 12:48:44.696121 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-log"
Mar 12 12:48:44.696170 master-0 kubenswrapper[13984]: I0312 12:48:44.696130 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-log"
Mar 12 12:48:44.696906 master-0 kubenswrapper[13984]: I0312 12:48:44.696720 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-log"
Mar 12 12:48:44.696906 master-0 kubenswrapper[13984]: I0312 12:48:44.696772 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" containerName="nova-metadata-metadata"
Mar 12 12:48:44.699720 master-0 kubenswrapper[13984]: I0312 12:48:44.699691 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 12:48:44.704642 master-0 kubenswrapper[13984]: I0312 12:48:44.704420 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Mar 12 12:48:44.704642 master-0 kubenswrapper[13984]: I0312 12:48:44.704633 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Mar 12 12:48:44.725973 master-0 kubenswrapper[13984]: I0312 12:48:44.725787 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:44.787904 master-0 kubenswrapper[13984]: I0312 12:48:44.787826 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.788147 master-0 kubenswrapper[13984]: I0312 12:48:44.787935 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-config-data\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.788147 master-0 kubenswrapper[13984]: I0312 12:48:44.788065 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b057e30b-a941-4503-b0fb-e72d5783df6f-logs\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.788240 master-0 kubenswrapper[13984]: I0312 12:48:44.788147 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.788240 master-0 kubenswrapper[13984]: I0312 12:48:44.788214 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5czfr\" (UniqueName: \"kubernetes.io/projected/b057e30b-a941-4503-b0fb-e72d5783df6f-kube-api-access-5czfr\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.891866 master-0 kubenswrapper[13984]: I0312 12:48:44.891394 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.891866 master-0 kubenswrapper[13984]: I0312 12:48:44.891519 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-config-data\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.891866 master-0 kubenswrapper[13984]: I0312 12:48:44.891639 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b057e30b-a941-4503-b0fb-e72d5783df6f-logs\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.891866 master-0 kubenswrapper[13984]: I0312 12:48:44.891722 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.892503 master-0 kubenswrapper[13984]: I0312 12:48:44.892235 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5czfr\" (UniqueName: \"kubernetes.io/projected/b057e30b-a941-4503-b0fb-e72d5783df6f-kube-api-access-5czfr\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.893580 master-0 kubenswrapper[13984]: I0312 12:48:44.893489 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b057e30b-a941-4503-b0fb-e72d5783df6f-logs\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.895560 master-0 kubenswrapper[13984]: I0312 12:48:44.895536 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.897297 master-0 kubenswrapper[13984]: I0312 12:48:44.897260 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-config-data\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.897497 master-0 kubenswrapper[13984]: I0312 12:48:44.897447 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:44.912705 master-0 kubenswrapper[13984]: I0312 12:48:44.912660 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5czfr\" (UniqueName: \"kubernetes.io/projected/b057e30b-a941-4503-b0fb-e72d5783df6f-kube-api-access-5czfr\") pod \"nova-metadata-0\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " pod="openstack/nova-metadata-0"
Mar 12 12:48:45.033819 master-0 kubenswrapper[13984]: I0312 12:48:45.033639 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 12 12:48:45.596493 master-0 kubenswrapper[13984]: I0312 12:48:45.596404 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 12 12:48:45.605943 master-0 kubenswrapper[13984]: W0312 12:48:45.605882 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb057e30b_a941_4503_b0fb_e72d5783df6f.slice/crio-61aca3b959737daa89fde3fca5fbad4c4985e894e245ebdee91bdfa34f6b34a9 WatchSource:0}: Error finding container 61aca3b959737daa89fde3fca5fbad4c4985e894e245ebdee91bdfa34f6b34a9: Status 404 returned error can't find the container with id 61aca3b959737daa89fde3fca5fbad4c4985e894e245ebdee91bdfa34f6b34a9
Mar 12 12:48:46.000095 master-0 kubenswrapper[13984]: I0312 12:48:46.000021 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11a6896c-57a5-4e52-8db8-5224f277cfcb" path="/var/lib/kubelet/pods/11a6896c-57a5-4e52-8db8-5224f277cfcb/volumes"
Mar 12 12:48:46.609189 master-0 kubenswrapper[13984]: I0312 12:48:46.609109 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b057e30b-a941-4503-b0fb-e72d5783df6f","Type":"ContainerStarted","Data":"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb"}
Mar 12 12:48:46.609189 master-0 kubenswrapper[13984]: I0312 12:48:46.609158 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b057e30b-a941-4503-b0fb-e72d5783df6f","Type":"ContainerStarted","Data":"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46"}
Mar 12 12:48:46.609189 master-0 kubenswrapper[13984]: I0312 12:48:46.609168 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b057e30b-a941-4503-b0fb-e72d5783df6f","Type":"ContainerStarted","Data":"61aca3b959737daa89fde3fca5fbad4c4985e894e245ebdee91bdfa34f6b34a9"}
Mar 12 12:48:47.032076 master-0 kubenswrapper[13984]: I0312 12:48:47.031978 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 12:48:47.032076 master-0 kubenswrapper[13984]: I0312 12:48:47.032055 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 12:48:47.207898 master-0 kubenswrapper[13984]: I0312 12:48:47.207840 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 12 12:48:47.224078 master-0 kubenswrapper[13984]: I0312 12:48:47.223926 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 12 12:48:47.224078 master-0 kubenswrapper[13984]: I0312 12:48:47.223994 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 12 12:48:47.264618 master-0 kubenswrapper[13984]: I0312 12:48:47.264582 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 12 12:48:47.673848 master-0 kubenswrapper[13984]: I0312 12:48:47.671034 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.671008692 podStartE2EDuration="3.671008692s" podCreationTimestamp="2026-03-12 12:48:44 +0000 UTC"
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:48:47.655343113 +0000 UTC m=+1459.853358615" watchObservedRunningTime="2026-03-12 12:48:47.671008692 +0000 UTC m=+1459.869024194" Mar 12 12:48:47.684399 master-0 kubenswrapper[13984]: I0312 12:48:47.683700 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 12:48:47.703339 master-0 kubenswrapper[13984]: I0312 12:48:47.702745 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:48:47.843914 master-0 kubenswrapper[13984]: I0312 12:48:47.842930 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7897fc7475-gbjfd"] Mar 12 12:48:47.843914 master-0 kubenswrapper[13984]: I0312 12:48:47.843247 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerName="dnsmasq-dns" containerID="cri-o://e8013f87d8357a28b6f88b55d2368b7b95eee75281404addd1a8208622a8c01e" gracePeriod=10 Mar 12 12:48:48.075828 master-0 kubenswrapper[13984]: I0312 12:48:48.075731 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.0.244:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 12:48:48.116897 master-0 kubenswrapper[13984]: I0312 12:48:48.116790 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.0.244:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 12 12:48:48.195052 master-0 
kubenswrapper[13984]: I0312 12:48:48.194938 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.235:5353: connect: connection refused" Mar 12 12:48:50.034771 master-0 kubenswrapper[13984]: I0312 12:48:50.034660 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 12:48:50.035582 master-0 kubenswrapper[13984]: I0312 12:48:50.035119 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 12:48:52.698230 master-0 kubenswrapper[13984]: I0312 12:48:52.696708 13984 generic.go:334] "Generic (PLEG): container finished" podID="ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" containerID="992d46a6738bd355d5fe196354a0271b648731fd3583de6c8c51344d25800e0a" exitCode=0 Mar 12 12:48:52.698230 master-0 kubenswrapper[13984]: I0312 12:48:52.696758 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vd6w8" event={"ID":"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e","Type":"ContainerDied","Data":"992d46a6738bd355d5fe196354a0271b648731fd3583de6c8c51344d25800e0a"} Mar 12 12:48:52.707385 master-0 kubenswrapper[13984]: I0312 12:48:52.707305 13984 generic.go:334] "Generic (PLEG): container finished" podID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerID="e8013f87d8357a28b6f88b55d2368b7b95eee75281404addd1a8208622a8c01e" exitCode=0 Mar 12 12:48:52.707385 master-0 kubenswrapper[13984]: I0312 12:48:52.707378 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" event={"ID":"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545","Type":"ContainerDied","Data":"e8013f87d8357a28b6f88b55d2368b7b95eee75281404addd1a8208622a8c01e"} Mar 12 12:48:53.089721 master-0 kubenswrapper[13984]: I0312 12:48:53.089686 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" Mar 12 12:48:53.143038 master-0 kubenswrapper[13984]: I0312 12:48:53.142369 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6dk2\" (UniqueName: \"kubernetes.io/projected/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-kube-api-access-w6dk2\") pod \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " Mar 12 12:48:53.143038 master-0 kubenswrapper[13984]: I0312 12:48:53.142659 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-sb\") pod \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " Mar 12 12:48:53.143038 master-0 kubenswrapper[13984]: I0312 12:48:53.142749 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-svc\") pod \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " Mar 12 12:48:53.143038 master-0 kubenswrapper[13984]: I0312 12:48:53.142803 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-config\") pod \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " Mar 12 12:48:53.143038 master-0 kubenswrapper[13984]: I0312 12:48:53.142922 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-nb\") pod \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " Mar 12 12:48:53.143038 master-0 kubenswrapper[13984]: I0312 12:48:53.142987 13984 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-swift-storage-0\") pod \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\" (UID: \"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545\") " Mar 12 12:48:53.172528 master-0 kubenswrapper[13984]: I0312 12:48:53.171781 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-kube-api-access-w6dk2" (OuterVolumeSpecName: "kube-api-access-w6dk2") pod "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" (UID: "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545"). InnerVolumeSpecName "kube-api-access-w6dk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:48:53.233255 master-0 kubenswrapper[13984]: I0312 12:48:53.233178 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" (UID: "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:53.237097 master-0 kubenswrapper[13984]: I0312 12:48:53.237046 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-config" (OuterVolumeSpecName: "config") pod "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" (UID: "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:53.242294 master-0 kubenswrapper[13984]: I0312 12:48:53.242160 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" (UID: "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:53.244175 master-0 kubenswrapper[13984]: I0312 12:48:53.243713 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" (UID: "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:53.246707 master-0 kubenswrapper[13984]: I0312 12:48:53.246562 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6dk2\" (UniqueName: \"kubernetes.io/projected/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-kube-api-access-w6dk2\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:53.247922 master-0 kubenswrapper[13984]: I0312 12:48:53.247832 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:53.248146 master-0 kubenswrapper[13984]: I0312 12:48:53.247890 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:53.248146 master-0 kubenswrapper[13984]: I0312 12:48:53.248105 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:53.248146 master-0 kubenswrapper[13984]: I0312 12:48:53.248123 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:53.254902 master-0 kubenswrapper[13984]: I0312 12:48:53.254856 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" (UID: "9d1e47c3-a55d-49d9-a7bd-f3c69b32f545"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:48:53.350511 master-0 kubenswrapper[13984]: I0312 12:48:53.350443 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:53.728399 master-0 kubenswrapper[13984]: I0312 12:48:53.728340 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" event={"ID":"9d1e47c3-a55d-49d9-a7bd-f3c69b32f545","Type":"ContainerDied","Data":"f5ea50cb525fd351e84b1ea49512fe7712dda7f65b55251ad63872445842102b"} Mar 12 12:48:53.728999 master-0 kubenswrapper[13984]: I0312 12:48:53.728381 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7897fc7475-gbjfd" Mar 12 12:48:53.728999 master-0 kubenswrapper[13984]: I0312 12:48:53.728439 13984 scope.go:117] "RemoveContainer" containerID="e8013f87d8357a28b6f88b55d2368b7b95eee75281404addd1a8208622a8c01e" Mar 12 12:48:53.732655 master-0 kubenswrapper[13984]: I0312 12:48:53.732593 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"f6d95490-fdae-4ecb-961d-553a7fda1436","Type":"ContainerStarted","Data":"b52f7567d544213fba9b91a40843bfca882dec1e91e92335f13806023f5372a4"} Mar 12 12:48:53.732920 master-0 kubenswrapper[13984]: I0312 12:48:53.732876 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 12 12:48:53.774503 master-0 kubenswrapper[13984]: I0312 12:48:53.773754 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=2.447020643 podStartE2EDuration="17.773732138s" podCreationTimestamp="2026-03-12 12:48:36 +0000 UTC" firstStartedPulling="2026-03-12 12:48:37.451444334 +0000 UTC m=+1449.649459836" lastFinishedPulling="2026-03-12 12:48:52.778155849 +0000 UTC m=+1464.976171331" observedRunningTime="2026-03-12 12:48:53.759015954 +0000 UTC m=+1465.957031456" watchObservedRunningTime="2026-03-12 12:48:53.773732138 +0000 UTC m=+1465.971747650" Mar 12 12:48:53.790199 master-0 kubenswrapper[13984]: I0312 12:48:53.789447 13984 scope.go:117] "RemoveContainer" containerID="9664021e473026c9b519516a59fb0bbcbc39bddc0af1dd2934f2954f6177adf2" Mar 12 12:48:53.821502 master-0 kubenswrapper[13984]: I0312 12:48:53.811118 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 12 12:48:53.840793 master-0 kubenswrapper[13984]: I0312 12:48:53.840677 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-7897fc7475-gbjfd"] Mar 12 12:48:53.871652 master-0 kubenswrapper[13984]: I0312 12:48:53.871566 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7897fc7475-gbjfd"] Mar 12 12:48:53.999580 master-0 kubenswrapper[13984]: I0312 12:48:53.999426 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" path="/var/lib/kubelet/pods/9d1e47c3-a55d-49d9-a7bd-f3c69b32f545/volumes" Mar 12 12:48:54.155673 master-0 kubenswrapper[13984]: I0312 12:48:54.155623 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vd6w8" Mar 12 12:48:54.275576 master-0 kubenswrapper[13984]: I0312 12:48:54.274662 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-scripts\") pod \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " Mar 12 12:48:54.275576 master-0 kubenswrapper[13984]: I0312 12:48:54.274965 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdmzn\" (UniqueName: \"kubernetes.io/projected/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-kube-api-access-bdmzn\") pod \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " Mar 12 12:48:54.275576 master-0 kubenswrapper[13984]: I0312 12:48:54.274998 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-config-data\") pod \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " Mar 12 12:48:54.275576 master-0 kubenswrapper[13984]: I0312 12:48:54.275096 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-combined-ca-bundle\") pod \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\" (UID: \"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e\") " Mar 12 12:48:54.287503 master-0 kubenswrapper[13984]: I0312 12:48:54.280686 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-kube-api-access-bdmzn" (OuterVolumeSpecName: "kube-api-access-bdmzn") pod "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" (UID: "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e"). InnerVolumeSpecName "kube-api-access-bdmzn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:48:54.294600 master-0 kubenswrapper[13984]: I0312 12:48:54.292943 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-scripts" (OuterVolumeSpecName: "scripts") pod "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" (UID: "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:54.301719 master-0 kubenswrapper[13984]: I0312 12:48:54.301646 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-config-data" (OuterVolumeSpecName: "config-data") pod "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" (UID: "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:54.329441 master-0 kubenswrapper[13984]: I0312 12:48:54.329366 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" (UID: "ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:54.378618 master-0 kubenswrapper[13984]: I0312 12:48:54.378540 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdmzn\" (UniqueName: \"kubernetes.io/projected/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-kube-api-access-bdmzn\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:54.378618 master-0 kubenswrapper[13984]: I0312 12:48:54.378597 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:54.378618 master-0 kubenswrapper[13984]: I0312 12:48:54.378610 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:54.378618 master-0 kubenswrapper[13984]: I0312 12:48:54.378620 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:54.756530 master-0 kubenswrapper[13984]: I0312 12:48:54.756434 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vd6w8" Mar 12 12:48:54.757636 master-0 kubenswrapper[13984]: I0312 12:48:54.756445 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vd6w8" event={"ID":"ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e","Type":"ContainerDied","Data":"8d317ace1c7b737a2ed39a0186f8f2fceef4ca1dad786ae42394415e40ce5f98"} Mar 12 12:48:54.757636 master-0 kubenswrapper[13984]: I0312 12:48:54.756625 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d317ace1c7b737a2ed39a0186f8f2fceef4ca1dad786ae42394415e40ce5f98" Mar 12 12:48:54.923969 master-0 kubenswrapper[13984]: I0312 12:48:54.923895 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:48:54.924238 master-0 kubenswrapper[13984]: I0312 12:48:54.924146 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-log" containerID="cri-o://8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242" gracePeriod=30 Mar 12 12:48:54.924337 master-0 kubenswrapper[13984]: I0312 12:48:54.924243 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-api" containerID="cri-o://ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0" gracePeriod=30 Mar 12 12:48:54.966752 master-0 kubenswrapper[13984]: I0312 12:48:54.962696 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:48:54.966752 master-0 kubenswrapper[13984]: I0312 12:48:54.963000 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="be2c270c-f966-42c6-858d-7392b2902c0d" containerName="nova-scheduler-scheduler" 
containerID="cri-o://62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5" gracePeriod=30 Mar 12 12:48:54.982550 master-0 kubenswrapper[13984]: I0312 12:48:54.982430 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:48:54.983108 master-0 kubenswrapper[13984]: I0312 12:48:54.982786 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-log" containerID="cri-o://471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46" gracePeriod=30 Mar 12 12:48:54.983108 master-0 kubenswrapper[13984]: I0312 12:48:54.982817 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-metadata" containerID="cri-o://a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb" gracePeriod=30 Mar 12 12:48:55.579568 master-0 kubenswrapper[13984]: I0312 12:48:55.579372 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:48:55.634220 master-0 kubenswrapper[13984]: I0312 12:48:55.634027 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5czfr\" (UniqueName: \"kubernetes.io/projected/b057e30b-a941-4503-b0fb-e72d5783df6f-kube-api-access-5czfr\") pod \"b057e30b-a941-4503-b0fb-e72d5783df6f\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " Mar 12 12:48:55.634435 master-0 kubenswrapper[13984]: I0312 12:48:55.634239 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-config-data\") pod \"b057e30b-a941-4503-b0fb-e72d5783df6f\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " Mar 12 12:48:55.634435 master-0 kubenswrapper[13984]: I0312 12:48:55.634394 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-nova-metadata-tls-certs\") pod \"b057e30b-a941-4503-b0fb-e72d5783df6f\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " Mar 12 12:48:55.634562 master-0 kubenswrapper[13984]: I0312 12:48:55.634432 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-combined-ca-bundle\") pod \"b057e30b-a941-4503-b0fb-e72d5783df6f\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " Mar 12 12:48:55.634562 master-0 kubenswrapper[13984]: I0312 12:48:55.634507 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b057e30b-a941-4503-b0fb-e72d5783df6f-logs\") pod \"b057e30b-a941-4503-b0fb-e72d5783df6f\" (UID: \"b057e30b-a941-4503-b0fb-e72d5783df6f\") " Mar 12 12:48:55.635013 master-0 kubenswrapper[13984]: I0312 12:48:55.634949 13984 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b057e30b-a941-4503-b0fb-e72d5783df6f-logs" (OuterVolumeSpecName: "logs") pod "b057e30b-a941-4503-b0fb-e72d5783df6f" (UID: "b057e30b-a941-4503-b0fb-e72d5783df6f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:48:55.635408 master-0 kubenswrapper[13984]: I0312 12:48:55.635366 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b057e30b-a941-4503-b0fb-e72d5783df6f-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:55.638872 master-0 kubenswrapper[13984]: I0312 12:48:55.638818 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b057e30b-a941-4503-b0fb-e72d5783df6f-kube-api-access-5czfr" (OuterVolumeSpecName: "kube-api-access-5czfr") pod "b057e30b-a941-4503-b0fb-e72d5783df6f" (UID: "b057e30b-a941-4503-b0fb-e72d5783df6f"). InnerVolumeSpecName "kube-api-access-5czfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:48:55.671496 master-0 kubenswrapper[13984]: I0312 12:48:55.671323 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-config-data" (OuterVolumeSpecName: "config-data") pod "b057e30b-a941-4503-b0fb-e72d5783df6f" (UID: "b057e30b-a941-4503-b0fb-e72d5783df6f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:55.674546 master-0 kubenswrapper[13984]: I0312 12:48:55.673267 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b057e30b-a941-4503-b0fb-e72d5783df6f" (UID: "b057e30b-a941-4503-b0fb-e72d5783df6f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:55.703210 master-0 kubenswrapper[13984]: I0312 12:48:55.703131 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b057e30b-a941-4503-b0fb-e72d5783df6f" (UID: "b057e30b-a941-4503-b0fb-e72d5783df6f"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:55.737558 master-0 kubenswrapper[13984]: I0312 12:48:55.737467 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:55.737558 master-0 kubenswrapper[13984]: I0312 12:48:55.737549 13984 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:55.737558 master-0 kubenswrapper[13984]: I0312 12:48:55.737561 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b057e30b-a941-4503-b0fb-e72d5783df6f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:55.737558 master-0 kubenswrapper[13984]: I0312 12:48:55.737570 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5czfr\" (UniqueName: \"kubernetes.io/projected/b057e30b-a941-4503-b0fb-e72d5783df6f-kube-api-access-5czfr\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:55.830033 master-0 kubenswrapper[13984]: I0312 12:48:55.829906 13984 generic.go:334] "Generic (PLEG): container finished" podID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerID="a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb" exitCode=0 Mar 12 12:48:55.830033 
master-0 kubenswrapper[13984]: I0312 12:48:55.829947 13984 generic.go:334] "Generic (PLEG): container finished" podID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerID="471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46" exitCode=143 Mar 12 12:48:55.830033 master-0 kubenswrapper[13984]: I0312 12:48:55.829991 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b057e30b-a941-4503-b0fb-e72d5783df6f","Type":"ContainerDied","Data":"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb"} Mar 12 12:48:55.830033 master-0 kubenswrapper[13984]: I0312 12:48:55.830020 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b057e30b-a941-4503-b0fb-e72d5783df6f","Type":"ContainerDied","Data":"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46"} Mar 12 12:48:55.830033 master-0 kubenswrapper[13984]: I0312 12:48:55.830031 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b057e30b-a941-4503-b0fb-e72d5783df6f","Type":"ContainerDied","Data":"61aca3b959737daa89fde3fca5fbad4c4985e894e245ebdee91bdfa34f6b34a9"} Mar 12 12:48:55.831256 master-0 kubenswrapper[13984]: I0312 12:48:55.830049 13984 scope.go:117] "RemoveContainer" containerID="a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb" Mar 12 12:48:55.831256 master-0 kubenswrapper[13984]: I0312 12:48:55.830176 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:48:55.852609 master-0 kubenswrapper[13984]: I0312 12:48:55.852491 13984 generic.go:334] "Generic (PLEG): container finished" podID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerID="8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242" exitCode=143 Mar 12 12:48:55.853403 master-0 kubenswrapper[13984]: I0312 12:48:55.853349 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e406a2b-18fa-448b-888e-9e6a95bbe747","Type":"ContainerDied","Data":"8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242"} Mar 12 12:48:55.892205 master-0 kubenswrapper[13984]: I0312 12:48:55.892012 13984 scope.go:117] "RemoveContainer" containerID="471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46" Mar 12 12:48:55.894358 master-0 kubenswrapper[13984]: I0312 12:48:55.894304 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:48:55.915967 master-0 kubenswrapper[13984]: I0312 12:48:55.915449 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.946784 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: E0312 12:48:55.947470 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" containerName="nova-manage" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.947511 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" containerName="nova-manage" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: E0312 12:48:55.947564 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerName="dnsmasq-dns" Mar 12 12:48:55.950849 master-0 
kubenswrapper[13984]: I0312 12:48:55.947574 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerName="dnsmasq-dns" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: E0312 12:48:55.947595 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-metadata" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.947603 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-metadata" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: E0312 12:48:55.947631 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-log" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.947639 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-log" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: E0312 12:48:55.947654 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerName="init" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.947663 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerName="init" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.947984 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-log" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.948012 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e" containerName="nova-manage" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.948034 13984 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="9d1e47c3-a55d-49d9-a7bd-f3c69b32f545" containerName="dnsmasq-dns" Mar 12 12:48:55.950849 master-0 kubenswrapper[13984]: I0312 12:48:55.948059 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" containerName="nova-metadata-metadata" Mar 12 12:48:55.953114 master-0 kubenswrapper[13984]: I0312 12:48:55.952382 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:48:55.966462 master-0 kubenswrapper[13984]: I0312 12:48:55.965322 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 12:48:55.966462 master-0 kubenswrapper[13984]: I0312 12:48:55.965423 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 12:48:55.988039 master-0 kubenswrapper[13984]: I0312 12:48:55.977983 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:48:55.988039 master-0 kubenswrapper[13984]: I0312 12:48:55.981122 13984 scope.go:117] "RemoveContainer" containerID="a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb" Mar 12 12:48:55.989123 master-0 kubenswrapper[13984]: E0312 12:48:55.988807 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb\": container with ID starting with a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb not found: ID does not exist" containerID="a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb" Mar 12 12:48:55.989123 master-0 kubenswrapper[13984]: I0312 12:48:55.988902 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb"} err="failed to get container status 
\"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb\": rpc error: code = NotFound desc = could not find container \"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb\": container with ID starting with a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb not found: ID does not exist" Mar 12 12:48:55.989123 master-0 kubenswrapper[13984]: I0312 12:48:55.988948 13984 scope.go:117] "RemoveContainer" containerID="471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46" Mar 12 12:48:56.006519 master-0 kubenswrapper[13984]: E0312 12:48:55.995147 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46\": container with ID starting with 471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46 not found: ID does not exist" containerID="471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46" Mar 12 12:48:56.006519 master-0 kubenswrapper[13984]: I0312 12:48:56.006168 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46"} err="failed to get container status \"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46\": rpc error: code = NotFound desc = could not find container \"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46\": container with ID starting with 471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46 not found: ID does not exist" Mar 12 12:48:56.006519 master-0 kubenswrapper[13984]: I0312 12:48:56.005850 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b057e30b-a941-4503-b0fb-e72d5783df6f" path="/var/lib/kubelet/pods/b057e30b-a941-4503-b0fb-e72d5783df6f/volumes" Mar 12 12:48:56.006519 master-0 kubenswrapper[13984]: I0312 12:48:56.006261 13984 scope.go:117] "RemoveContainer" 
containerID="a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb" Mar 12 12:48:56.007060 master-0 kubenswrapper[13984]: I0312 12:48:56.006998 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb"} err="failed to get container status \"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb\": rpc error: code = NotFound desc = could not find container \"a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb\": container with ID starting with a3e941f34824773b0185602890d4a33b45c2a54167282b269fcdc36b6911c7bb not found: ID does not exist" Mar 12 12:48:56.007117 master-0 kubenswrapper[13984]: I0312 12:48:56.007066 13984 scope.go:117] "RemoveContainer" containerID="471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46" Mar 12 12:48:56.007835 master-0 kubenswrapper[13984]: I0312 12:48:56.007429 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46"} err="failed to get container status \"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46\": rpc error: code = NotFound desc = could not find container \"471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46\": container with ID starting with 471e5ed5f414596b3e7b27eb13aa7b09a4e76b6e324588aa516b055f30164f46 not found: ID does not exist" Mar 12 12:48:56.075830 master-0 kubenswrapper[13984]: I0312 12:48:56.075755 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2132b-24d6-45c5-b5ce-728ac475768a-logs\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.076076 master-0 kubenswrapper[13984]: I0312 12:48:56.075871 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.076076 master-0 kubenswrapper[13984]: I0312 12:48:56.075902 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-config-data\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.076076 master-0 kubenswrapper[13984]: I0312 12:48:56.075943 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.076076 master-0 kubenswrapper[13984]: I0312 12:48:56.076005 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2gkg\" (UniqueName: \"kubernetes.io/projected/04e2132b-24d6-45c5-b5ce-728ac475768a-kube-api-access-n2gkg\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.179707 master-0 kubenswrapper[13984]: I0312 12:48:56.179560 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2132b-24d6-45c5-b5ce-728ac475768a-logs\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.179707 master-0 kubenswrapper[13984]: I0312 12:48:56.178825 13984 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2132b-24d6-45c5-b5ce-728ac475768a-logs\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.179984 master-0 kubenswrapper[13984]: I0312 12:48:56.179874 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.180842 master-0 kubenswrapper[13984]: I0312 12:48:56.180784 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-config-data\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.180987 master-0 kubenswrapper[13984]: I0312 12:48:56.180929 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.181264 master-0 kubenswrapper[13984]: I0312 12:48:56.181224 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n2gkg\" (UniqueName: \"kubernetes.io/projected/04e2132b-24d6-45c5-b5ce-728ac475768a-kube-api-access-n2gkg\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.184455 master-0 kubenswrapper[13984]: I0312 12:48:56.183800 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.189388 master-0 kubenswrapper[13984]: I0312 12:48:56.186431 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-config-data\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.195511 master-0 kubenswrapper[13984]: I0312 12:48:56.190464 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.202499 master-0 kubenswrapper[13984]: I0312 12:48:56.199508 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2gkg\" (UniqueName: \"kubernetes.io/projected/04e2132b-24d6-45c5-b5ce-728ac475768a-kube-api-access-n2gkg\") pod \"nova-metadata-0\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " pod="openstack/nova-metadata-0" Mar 12 12:48:56.329222 master-0 kubenswrapper[13984]: I0312 12:48:56.327875 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:48:56.812697 master-0 kubenswrapper[13984]: I0312 12:48:56.812653 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:48:56.871576 master-0 kubenswrapper[13984]: I0312 12:48:56.871520 13984 generic.go:334] "Generic (PLEG): container finished" podID="ea457152-df63-450a-be0c-aaca7e72b0b4" containerID="7c06ae70f62ae63073364d3a235d96da95cf089bef58484ccc550a4d314281e0" exitCode=0 Mar 12 12:48:56.877058 master-0 kubenswrapper[13984]: I0312 12:48:56.871604 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rf48n" event={"ID":"ea457152-df63-450a-be0c-aaca7e72b0b4","Type":"ContainerDied","Data":"7c06ae70f62ae63073364d3a235d96da95cf089bef58484ccc550a4d314281e0"} Mar 12 12:48:56.877058 master-0 kubenswrapper[13984]: I0312 12:48:56.874914 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04e2132b-24d6-45c5-b5ce-728ac475768a","Type":"ContainerStarted","Data":"5562e65b31e97a9c6eb120c32b4647e8feaaafcc10286c8eecf918c6ab7bedae"} Mar 12 12:48:57.226162 master-0 kubenswrapper[13984]: E0312 12:48:57.226086 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 12:48:57.229177 master-0 kubenswrapper[13984]: E0312 12:48:57.229107 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 12:48:57.231436 master-0 
kubenswrapper[13984]: E0312 12:48:57.231382 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 12:48:57.231533 master-0 kubenswrapper[13984]: E0312 12:48:57.231445 13984 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="be2c270c-f966-42c6-858d-7392b2902c0d" containerName="nova-scheduler-scheduler" Mar 12 12:48:57.891701 master-0 kubenswrapper[13984]: I0312 12:48:57.891631 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04e2132b-24d6-45c5-b5ce-728ac475768a","Type":"ContainerStarted","Data":"1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f"} Mar 12 12:48:57.891701 master-0 kubenswrapper[13984]: I0312 12:48:57.891700 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04e2132b-24d6-45c5-b5ce-728ac475768a","Type":"ContainerStarted","Data":"681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4"} Mar 12 12:48:57.979235 master-0 kubenswrapper[13984]: I0312 12:48:57.979102 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.979077335 podStartE2EDuration="2.979077335s" podCreationTimestamp="2026-03-12 12:48:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:48:57.97161033 +0000 UTC m=+1470.169625822" watchObservedRunningTime="2026-03-12 12:48:57.979077335 +0000 UTC m=+1470.177092827" Mar 12 12:48:58.359371 
master-0 kubenswrapper[13984]: I0312 12:48:58.359302 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rf48n" Mar 12 12:48:58.458998 master-0 kubenswrapper[13984]: I0312 12:48:58.458935 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-combined-ca-bundle\") pod \"ea457152-df63-450a-be0c-aaca7e72b0b4\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " Mar 12 12:48:58.459230 master-0 kubenswrapper[13984]: I0312 12:48:58.459133 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2jt2\" (UniqueName: \"kubernetes.io/projected/ea457152-df63-450a-be0c-aaca7e72b0b4-kube-api-access-n2jt2\") pod \"ea457152-df63-450a-be0c-aaca7e72b0b4\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " Mar 12 12:48:58.459230 master-0 kubenswrapper[13984]: I0312 12:48:58.459159 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-config-data\") pod \"ea457152-df63-450a-be0c-aaca7e72b0b4\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " Mar 12 12:48:58.459334 master-0 kubenswrapper[13984]: I0312 12:48:58.459236 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-scripts\") pod \"ea457152-df63-450a-be0c-aaca7e72b0b4\" (UID: \"ea457152-df63-450a-be0c-aaca7e72b0b4\") " Mar 12 12:48:58.463203 master-0 kubenswrapper[13984]: I0312 12:48:58.463128 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-scripts" (OuterVolumeSpecName: "scripts") pod "ea457152-df63-450a-be0c-aaca7e72b0b4" (UID: "ea457152-df63-450a-be0c-aaca7e72b0b4"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:58.474857 master-0 kubenswrapper[13984]: I0312 12:48:58.474807 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea457152-df63-450a-be0c-aaca7e72b0b4-kube-api-access-n2jt2" (OuterVolumeSpecName: "kube-api-access-n2jt2") pod "ea457152-df63-450a-be0c-aaca7e72b0b4" (UID: "ea457152-df63-450a-be0c-aaca7e72b0b4"). InnerVolumeSpecName "kube-api-access-n2jt2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:48:58.491485 master-0 kubenswrapper[13984]: I0312 12:48:58.491408 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-config-data" (OuterVolumeSpecName: "config-data") pod "ea457152-df63-450a-be0c-aaca7e72b0b4" (UID: "ea457152-df63-450a-be0c-aaca7e72b0b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:58.500525 master-0 kubenswrapper[13984]: I0312 12:48:58.500455 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ea457152-df63-450a-be0c-aaca7e72b0b4" (UID: "ea457152-df63-450a-be0c-aaca7e72b0b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:58.564363 master-0 kubenswrapper[13984]: I0312 12:48:58.564286 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.564363 master-0 kubenswrapper[13984]: I0312 12:48:58.564338 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2jt2\" (UniqueName: \"kubernetes.io/projected/ea457152-df63-450a-be0c-aaca7e72b0b4-kube-api-access-n2jt2\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.564363 master-0 kubenswrapper[13984]: I0312 12:48:58.564350 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.564363 master-0 kubenswrapper[13984]: I0312 12:48:58.564360 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ea457152-df63-450a-be0c-aaca7e72b0b4-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.628790 master-0 kubenswrapper[13984]: I0312 12:48:58.628755 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:48:58.770707 master-0 kubenswrapper[13984]: I0312 12:48:58.770543 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-combined-ca-bundle\") pod \"0e406a2b-18fa-448b-888e-9e6a95bbe747\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " Mar 12 12:48:58.770896 master-0 kubenswrapper[13984]: I0312 12:48:58.770740 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e406a2b-18fa-448b-888e-9e6a95bbe747-logs\") pod \"0e406a2b-18fa-448b-888e-9e6a95bbe747\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " Mar 12 12:48:58.770896 master-0 kubenswrapper[13984]: I0312 12:48:58.770846 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-config-data\") pod \"0e406a2b-18fa-448b-888e-9e6a95bbe747\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " Mar 12 12:48:58.771104 master-0 kubenswrapper[13984]: I0312 12:48:58.771070 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2ss8\" (UniqueName: \"kubernetes.io/projected/0e406a2b-18fa-448b-888e-9e6a95bbe747-kube-api-access-b2ss8\") pod \"0e406a2b-18fa-448b-888e-9e6a95bbe747\" (UID: \"0e406a2b-18fa-448b-888e-9e6a95bbe747\") " Mar 12 12:48:58.771626 master-0 kubenswrapper[13984]: I0312 12:48:58.771399 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e406a2b-18fa-448b-888e-9e6a95bbe747-logs" (OuterVolumeSpecName: "logs") pod "0e406a2b-18fa-448b-888e-9e6a95bbe747" (UID: "0e406a2b-18fa-448b-888e-9e6a95bbe747"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:48:58.772374 master-0 kubenswrapper[13984]: I0312 12:48:58.772329 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0e406a2b-18fa-448b-888e-9e6a95bbe747-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.773720 master-0 kubenswrapper[13984]: I0312 12:48:58.773671 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e406a2b-18fa-448b-888e-9e6a95bbe747-kube-api-access-b2ss8" (OuterVolumeSpecName: "kube-api-access-b2ss8") pod "0e406a2b-18fa-448b-888e-9e6a95bbe747" (UID: "0e406a2b-18fa-448b-888e-9e6a95bbe747"). InnerVolumeSpecName "kube-api-access-b2ss8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:48:58.797119 master-0 kubenswrapper[13984]: I0312 12:48:58.797070 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-config-data" (OuterVolumeSpecName: "config-data") pod "0e406a2b-18fa-448b-888e-9e6a95bbe747" (UID: "0e406a2b-18fa-448b-888e-9e6a95bbe747"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:58.799748 master-0 kubenswrapper[13984]: I0312 12:48:58.799718 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e406a2b-18fa-448b-888e-9e6a95bbe747" (UID: "0e406a2b-18fa-448b-888e-9e6a95bbe747"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:48:58.875111 master-0 kubenswrapper[13984]: I0312 12:48:58.875036 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.875111 master-0 kubenswrapper[13984]: I0312 12:48:58.875107 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2ss8\" (UniqueName: \"kubernetes.io/projected/0e406a2b-18fa-448b-888e-9e6a95bbe747-kube-api-access-b2ss8\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.875321 master-0 kubenswrapper[13984]: I0312 12:48:58.875128 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e406a2b-18fa-448b-888e-9e6a95bbe747-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:48:58.905658 master-0 kubenswrapper[13984]: I0312 12:48:58.905597 13984 generic.go:334] "Generic (PLEG): container finished" podID="43e6fb75-b813-4074-8029-d6817b1bb9e2" containerID="e3b5768c37d0235b52cac894792cae6af357605cab7eb9e9b9c19515dd1cd74d" exitCode=0 Mar 12 12:48:58.907158 master-0 kubenswrapper[13984]: I0312 12:48:58.905683 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerDied","Data":"e3b5768c37d0235b52cac894792cae6af357605cab7eb9e9b9c19515dd1cd74d"} Mar 12 12:48:58.908179 master-0 kubenswrapper[13984]: I0312 12:48:58.908119 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rf48n" event={"ID":"ea457152-df63-450a-be0c-aaca7e72b0b4","Type":"ContainerDied","Data":"f74e11103f78bc62152200e9759a500cab51afe893968503fbed0e326fa8282a"} Mar 12 12:48:58.908245 master-0 kubenswrapper[13984]: I0312 12:48:58.908178 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rf48n"
Mar 12 12:48:58.908803 master-0 kubenswrapper[13984]: I0312 12:48:58.908185 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f74e11103f78bc62152200e9759a500cab51afe893968503fbed0e326fa8282a"
Mar 12 12:48:58.911593 master-0 kubenswrapper[13984]: I0312 12:48:58.910824 13984 generic.go:334] "Generic (PLEG): container finished" podID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerID="ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0" exitCode=0
Mar 12 12:48:58.911593 master-0 kubenswrapper[13984]: I0312 12:48:58.911075 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e406a2b-18fa-448b-888e-9e6a95bbe747","Type":"ContainerDied","Data":"ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0"}
Mar 12 12:48:58.911593 master-0 kubenswrapper[13984]: I0312 12:48:58.911117 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"0e406a2b-18fa-448b-888e-9e6a95bbe747","Type":"ContainerDied","Data":"62b859158088b79415fb2c90f02bb96f92d7a40ad3ea82146f3b1dfb9252e154"}
Mar 12 12:48:58.911593 master-0 kubenswrapper[13984]: I0312 12:48:58.911131 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 12:48:58.911593 master-0 kubenswrapper[13984]: I0312 12:48:58.911135 13984 scope.go:117] "RemoveContainer" containerID="ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0"
Mar 12 12:48:58.940445 master-0 kubenswrapper[13984]: I0312 12:48:58.940400 13984 scope.go:117] "RemoveContainer" containerID="8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242"
Mar 12 12:48:58.993648 master-0 kubenswrapper[13984]: I0312 12:48:58.993603 13984 scope.go:117] "RemoveContainer" containerID="ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0"
Mar 12 12:48:58.994166 master-0 kubenswrapper[13984]: E0312 12:48:58.994120 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0\": container with ID starting with ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0 not found: ID does not exist" containerID="ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0"
Mar 12 12:48:58.994220 master-0 kubenswrapper[13984]: I0312 12:48:58.994170 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0"} err="failed to get container status \"ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0\": rpc error: code = NotFound desc = could not find container \"ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0\": container with ID starting with ceeab3ddbaad298c0a17769d175fab7fcc39ab2552bbc70cdb5f3e42eaf416e0 not found: ID does not exist"
Mar 12 12:48:58.994220 master-0 kubenswrapper[13984]: I0312 12:48:58.994198 13984 scope.go:117] "RemoveContainer" containerID="8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242"
Mar 12 12:48:58.994634 master-0 kubenswrapper[13984]: E0312 12:48:58.994606 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242\": container with ID starting with 8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242 not found: ID does not exist" containerID="8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242"
Mar 12 12:48:58.994684 master-0 kubenswrapper[13984]: I0312 12:48:58.994629 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242"} err="failed to get container status \"8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242\": rpc error: code = NotFound desc = could not find container \"8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242\": container with ID starting with 8a47c5ed87092025afcd7a71e32b492dc701f3076a5b665608f5df4c47d50242 not found: ID does not exist"
Mar 12 12:48:59.086586 master-0 kubenswrapper[13984]: I0312 12:48:59.084391 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:48:59.225498 master-0 kubenswrapper[13984]: I0312 12:48:59.221562 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:48:59.234592 master-0 kubenswrapper[13984]: I0312 12:48:59.234532 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: E0312 12:48:59.235117 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea457152-df63-450a-be0c-aaca7e72b0b4" containerName="nova-cell1-conductor-db-sync"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: I0312 12:48:59.235140 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea457152-df63-450a-be0c-aaca7e72b0b4" containerName="nova-cell1-conductor-db-sync"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: E0312 12:48:59.235190 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-api"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: I0312 12:48:59.235197 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-api"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: E0312 12:48:59.235229 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-log"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: I0312 12:48:59.235237 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-log"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: I0312 12:48:59.235459 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea457152-df63-450a-be0c-aaca7e72b0b4" containerName="nova-cell1-conductor-db-sync"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: I0312 12:48:59.235502 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-api"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: I0312 12:48:59.235549 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" containerName="nova-api-log"
Mar 12 12:48:59.238490 master-0 kubenswrapper[13984]: I0312 12:48:59.236833 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 12:48:59.248530 master-0 kubenswrapper[13984]: I0312 12:48:59.246299 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 12 12:48:59.248530 master-0 kubenswrapper[13984]: I0312 12:48:59.248102 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.250545 master-0 kubenswrapper[13984]: I0312 12:48:59.250514 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 12 12:48:59.250775 master-0 kubenswrapper[13984]: I0312 12:48:59.250754 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 12 12:48:59.288529 master-0 kubenswrapper[13984]: I0312 12:48:59.284188 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7d8x4\" (UniqueName: \"kubernetes.io/projected/458d63ec-daa2-41a3-903e-08d1a10c29df-kube-api-access-7d8x4\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.288529 master-0 kubenswrapper[13984]: I0312 12:48:59.284358 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-config-data\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.288529 master-0 kubenswrapper[13984]: I0312 12:48:59.284501 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458d63ec-daa2-41a3-903e-08d1a10c29df-logs\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.288529 master-0 kubenswrapper[13984]: I0312 12:48:59.284654 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.294025 master-0 kubenswrapper[13984]: I0312 12:48:59.292803 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:48:59.401014 master-0 kubenswrapper[13984]: I0312 12:48:59.400892 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f9b44e-10c5-46d9-8091-006ca18b30ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.401014 master-0 kubenswrapper[13984]: I0312 12:48:59.400980 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7d8x4\" (UniqueName: \"kubernetes.io/projected/458d63ec-daa2-41a3-903e-08d1a10c29df-kube-api-access-7d8x4\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.401248 master-0 kubenswrapper[13984]: I0312 12:48:59.401038 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-config-data\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.401248 master-0 kubenswrapper[13984]: I0312 12:48:59.401084 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nt7c\" (UniqueName: \"kubernetes.io/projected/67f9b44e-10c5-46d9-8091-006ca18b30ac-kube-api-access-9nt7c\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.401248 master-0 kubenswrapper[13984]: I0312 12:48:59.401192 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458d63ec-daa2-41a3-903e-08d1a10c29df-logs\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.401364 master-0 kubenswrapper[13984]: I0312 12:48:59.401261 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f9b44e-10c5-46d9-8091-006ca18b30ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.401364 master-0 kubenswrapper[13984]: I0312 12:48:59.401301 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.403021 master-0 kubenswrapper[13984]: I0312 12:48:59.402997 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458d63ec-daa2-41a3-903e-08d1a10c29df-logs\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.447506 master-0 kubenswrapper[13984]: I0312 12:48:59.447441 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.452332 master-0 kubenswrapper[13984]: I0312 12:48:59.452293 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-config-data\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.460570 master-0 kubenswrapper[13984]: I0312 12:48:59.460530 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7d8x4\" (UniqueName: \"kubernetes.io/projected/458d63ec-daa2-41a3-903e-08d1a10c29df-kube-api-access-7d8x4\") pod \"nova-api-0\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " pod="openstack/nova-api-0"
Mar 12 12:48:59.506228 master-0 kubenswrapper[13984]: I0312 12:48:59.505133 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nt7c\" (UniqueName: \"kubernetes.io/projected/67f9b44e-10c5-46d9-8091-006ca18b30ac-kube-api-access-9nt7c\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.506228 master-0 kubenswrapper[13984]: I0312 12:48:59.505372 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f9b44e-10c5-46d9-8091-006ca18b30ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.506228 master-0 kubenswrapper[13984]: I0312 12:48:59.505448 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f9b44e-10c5-46d9-8091-006ca18b30ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.507305 master-0 kubenswrapper[13984]: I0312 12:48:59.506667 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 12 12:48:59.512606 master-0 kubenswrapper[13984]: I0312 12:48:59.511113 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67f9b44e-10c5-46d9-8091-006ca18b30ac-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.527338 master-0 kubenswrapper[13984]: I0312 12:48:59.527294 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67f9b44e-10c5-46d9-8091-006ca18b30ac-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.537499 master-0 kubenswrapper[13984]: I0312 12:48:59.534489 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nt7c\" (UniqueName: \"kubernetes.io/projected/67f9b44e-10c5-46d9-8091-006ca18b30ac-kube-api-access-9nt7c\") pod \"nova-cell1-conductor-0\" (UID: \"67f9b44e-10c5-46d9-8091-006ca18b30ac\") " pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.587497 master-0 kubenswrapper[13984]: I0312 12:48:59.584861 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 12 12:48:59.631812 master-0 kubenswrapper[13984]: I0312 12:48:59.631754 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 12 12:48:59.937171 master-0 kubenswrapper[13984]: I0312 12:48:59.937121 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerStarted","Data":"768b74c64befd5495c135cbe0bf2ba508f9985e43f9322dca51470bedbeb65cf"}
Mar 12 12:49:00.013770 master-0 kubenswrapper[13984]: I0312 12:49:00.013596 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e406a2b-18fa-448b-888e-9e6a95bbe747" path="/var/lib/kubelet/pods/0e406a2b-18fa-448b-888e-9e6a95bbe747/volumes"
Mar 12 12:49:00.377733 master-0 kubenswrapper[13984]: I0312 12:49:00.377698 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 12 12:49:00.579766 master-0 kubenswrapper[13984]: I0312 12:49:00.579360 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 12:49:00.635369 master-0 kubenswrapper[13984]: I0312 12:49:00.635317 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-combined-ca-bundle\") pod \"be2c270c-f966-42c6-858d-7392b2902c0d\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") "
Mar 12 12:49:00.635587 master-0 kubenswrapper[13984]: I0312 12:49:00.635465 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-config-data\") pod \"be2c270c-f966-42c6-858d-7392b2902c0d\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") "
Mar 12 12:49:00.636434 master-0 kubenswrapper[13984]: I0312 12:49:00.636390 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9dqb\" (UniqueName: \"kubernetes.io/projected/be2c270c-f966-42c6-858d-7392b2902c0d-kube-api-access-z9dqb\") pod \"be2c270c-f966-42c6-858d-7392b2902c0d\" (UID: \"be2c270c-f966-42c6-858d-7392b2902c0d\") "
Mar 12 12:49:00.639498 master-0 kubenswrapper[13984]: I0312 12:49:00.639416 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be2c270c-f966-42c6-858d-7392b2902c0d-kube-api-access-z9dqb" (OuterVolumeSpecName: "kube-api-access-z9dqb") pod "be2c270c-f966-42c6-858d-7392b2902c0d" (UID: "be2c270c-f966-42c6-858d-7392b2902c0d"). InnerVolumeSpecName "kube-api-access-z9dqb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 12 12:49:00.663802 master-0 kubenswrapper[13984]: I0312 12:49:00.663734 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be2c270c-f966-42c6-858d-7392b2902c0d" (UID: "be2c270c-f966-42c6-858d-7392b2902c0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:49:00.673288 master-0 kubenswrapper[13984]: I0312 12:49:00.673227 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-config-data" (OuterVolumeSpecName: "config-data") pod "be2c270c-f966-42c6-858d-7392b2902c0d" (UID: "be2c270c-f966-42c6-858d-7392b2902c0d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 12 12:49:00.739297 master-0 kubenswrapper[13984]: I0312 12:49:00.739116 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 12 12:49:00.739297 master-0 kubenswrapper[13984]: I0312 12:49:00.739217 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be2c270c-f966-42c6-858d-7392b2902c0d-config-data\") on node \"master-0\" DevicePath \"\""
Mar 12 12:49:00.739297 master-0 kubenswrapper[13984]: I0312 12:49:00.739230 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9dqb\" (UniqueName: \"kubernetes.io/projected/be2c270c-f966-42c6-858d-7392b2902c0d-kube-api-access-z9dqb\") on node \"master-0\" DevicePath \"\""
Mar 12 12:49:00.887941 master-0 kubenswrapper[13984]: I0312 12:49:00.887152 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 12 12:49:00.953732 master-0 kubenswrapper[13984]: I0312 12:49:00.951593 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"67f9b44e-10c5-46d9-8091-006ca18b30ac","Type":"ContainerStarted","Data":"55748250fa7212c4f5a37a4917908281a638b4b4aaca3dfce09495494f5b67f5"}
Mar 12 12:49:00.959971 master-0 kubenswrapper[13984]: I0312 12:49:00.959518 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerStarted","Data":"997890bf4604b1a6eb26fa9704f76d2f657a290e3c77478c4b0837b32c3c11f0"}
Mar 12 12:49:00.964206 master-0 kubenswrapper[13984]: I0312 12:49:00.963947 13984 generic.go:334] "Generic (PLEG): container finished" podID="be2c270c-f966-42c6-858d-7392b2902c0d" containerID="62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5" exitCode=0
Mar 12 12:49:00.964206 master-0 kubenswrapper[13984]: I0312 12:49:00.964037 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be2c270c-f966-42c6-858d-7392b2902c0d","Type":"ContainerDied","Data":"62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5"}
Mar 12 12:49:00.964740 master-0 kubenswrapper[13984]: I0312 12:49:00.964045 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 12:49:00.964789 master-0 kubenswrapper[13984]: I0312 12:49:00.964065 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be2c270c-f966-42c6-858d-7392b2902c0d","Type":"ContainerDied","Data":"3ef4590d6d6e31d6fd3ceb975358a7354ef07d1f881d3cc4fdbefeaa13cc7b2d"}
Mar 12 12:49:00.964789 master-0 kubenswrapper[13984]: I0312 12:49:00.964077 13984 scope.go:117] "RemoveContainer" containerID="62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5"
Mar 12 12:49:00.966429 master-0 kubenswrapper[13984]: I0312 12:49:00.966395 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458d63ec-daa2-41a3-903e-08d1a10c29df","Type":"ContainerStarted","Data":"259b967a0dfa1a314b3fc2ba53759034dc95967bba03e380e04dbfab68d647da"}
Mar 12 12:49:00.966429 master-0 kubenswrapper[13984]: I0312 12:49:00.966421 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458d63ec-daa2-41a3-903e-08d1a10c29df","Type":"ContainerStarted","Data":"7b112264da30bd63bfb1639c2b115eb3d823493c7fcb4312761960a23e40f766"}
Mar 12 12:49:01.000428 master-0 kubenswrapper[13984]: I0312 12:49:01.000321 13984 scope.go:117] "RemoveContainer" containerID="62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5"
Mar 12 12:49:01.005654 master-0 kubenswrapper[13984]: E0312 12:49:01.005610 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5\": container with ID starting with 62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5 not found: ID does not exist" containerID="62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5"
Mar 12 12:49:01.005781 master-0 kubenswrapper[13984]: I0312 12:49:01.005651 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5"} err="failed to get container status \"62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5\": rpc error: code = NotFound desc = could not find container \"62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5\": container with ID starting with 62a1c154386fbd59c88cca84009215bdebb32227c6d0c9bbd455221bed9a3ad5 not found: ID does not exist"
Mar 12 12:49:01.227502 master-0 kubenswrapper[13984]: I0312 12:49:01.222650 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 12:49:01.329294 master-0 kubenswrapper[13984]: I0312 12:49:01.328137 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 12 12:49:01.329294 master-0 kubenswrapper[13984]: I0312 12:49:01.329262 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 12 12:49:01.348499 master-0 kubenswrapper[13984]: I0312 12:49:01.344972 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 12:49:01.387546 master-0 kubenswrapper[13984]: I0312 12:49:01.385240 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 12:49:01.387546 master-0 kubenswrapper[13984]: E0312 12:49:01.385773 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be2c270c-f966-42c6-858d-7392b2902c0d" containerName="nova-scheduler-scheduler"
Mar 12 12:49:01.387546 master-0 kubenswrapper[13984]: I0312 12:49:01.385787 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="be2c270c-f966-42c6-858d-7392b2902c0d" containerName="nova-scheduler-scheduler"
Mar 12 12:49:01.387546 master-0 kubenswrapper[13984]: I0312 12:49:01.385998 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="be2c270c-f966-42c6-858d-7392b2902c0d" containerName="nova-scheduler-scheduler"
Mar 12 12:49:01.387546 master-0 kubenswrapper[13984]: I0312 12:49:01.386722 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.392354 master-0 kubenswrapper[13984]: I0312 12:49:01.392215 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Mar 12 12:49:01.407464 master-0 kubenswrapper[13984]: I0312 12:49:01.407414 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 12:49:01.470992 master-0 kubenswrapper[13984]: I0312 12:49:01.470938 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzln\" (UniqueName: \"kubernetes.io/projected/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-kube-api-access-sdzln\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.471194 master-0 kubenswrapper[13984]: I0312 12:49:01.471034 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-config-data\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.471194 master-0 kubenswrapper[13984]: I0312 12:49:01.471105 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.574140 master-0 kubenswrapper[13984]: I0312 12:49:01.573943 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-config-data\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.574140 master-0 kubenswrapper[13984]: I0312 12:49:01.574112 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.577284 master-0 kubenswrapper[13984]: I0312 12:49:01.574355 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzln\" (UniqueName: \"kubernetes.io/projected/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-kube-api-access-sdzln\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.579444 master-0 kubenswrapper[13984]: I0312 12:49:01.579271 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-config-data\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.585289 master-0 kubenswrapper[13984]: I0312 12:49:01.585251 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.593876 master-0 kubenswrapper[13984]: I0312 12:49:01.593819 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzln\" (UniqueName: \"kubernetes.io/projected/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-kube-api-access-sdzln\") pod \"nova-scheduler-0\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " pod="openstack/nova-scheduler-0"
Mar 12 12:49:01.768941 master-0 kubenswrapper[13984]: I0312 12:49:01.768877 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 12 12:49:02.010644 master-0 kubenswrapper[13984]: I0312 12:49:02.010450 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be2c270c-f966-42c6-858d-7392b2902c0d" path="/var/lib/kubelet/pods/be2c270c-f966-42c6-858d-7392b2902c0d/volumes"
Mar 12 12:49:02.012266 master-0 kubenswrapper[13984]: I0312 12:49:02.012168 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458d63ec-daa2-41a3-903e-08d1a10c29df","Type":"ContainerStarted","Data":"e95df6d22c5135174dc12529ece3e7aa8a10239725074fd1c0f193a47a459627"}
Mar 12 12:49:02.012266 master-0 kubenswrapper[13984]: I0312 12:49:02.012235 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"67f9b44e-10c5-46d9-8091-006ca18b30ac","Type":"ContainerStarted","Data":"961e071423d83417af4bbc249077ef27779d254a6a5673ab5ff0261a8f2c13ce"}
Mar 12 12:49:02.012458 master-0 kubenswrapper[13984]: I0312 12:49:02.012348 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Mar 12 12:49:02.016508 master-0 kubenswrapper[13984]: I0312 12:49:02.016436 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"43e6fb75-b813-4074-8029-d6817b1bb9e2","Type":"ContainerStarted","Data":"9fc0efd8b661b7c9fd07b6fbb73d6d33f9f10e26a87d9f2a80b8561d93b509bf"}
Mar 12 12:49:02.020142 master-0 kubenswrapper[13984]: I0312 12:49:02.020090 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0"
Mar 12 12:49:02.023351 master-0 kubenswrapper[13984]: I0312 12:49:02.023308 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0"
Mar 12 12:49:02.057738 master-0 kubenswrapper[13984]: I0312 12:49:02.057666 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.057646539 podStartE2EDuration="3.057646539s" podCreationTimestamp="2026-03-12 12:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:02.052182029 +0000 UTC m=+1474.250197531" watchObservedRunningTime="2026-03-12 12:49:02.057646539 +0000 UTC m=+1474.255662031"
Mar 12 12:49:02.110205 master-0 kubenswrapper[13984]: I0312 12:49:02.110050 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.110011383 podStartE2EDuration="3.110011383s" podCreationTimestamp="2026-03-12 12:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:02.078977743 +0000 UTC m=+1474.276993235" watchObservedRunningTime="2026-03-12 12:49:02.110011383 +0000 UTC m=+1474.308026875"
Mar 12 12:49:02.157783 master-0 kubenswrapper[13984]: I0312 12:49:02.157491 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=66.367838205 podStartE2EDuration="1m53.157449553s" podCreationTimestamp="2026-03-12 12:47:09 +0000 UTC" firstStartedPulling="2026-03-12 12:47:21.856370756 +0000 UTC m=+1374.054386248" lastFinishedPulling="2026-03-12 12:48:08.645982084 +0000 UTC m=+1420.843997596" observedRunningTime="2026-03-12 12:49:02.138977207 +0000 UTC m=+1474.336992709" watchObservedRunningTime="2026-03-12 12:49:02.157449553 +0000 UTC m=+1474.355465045"
Mar 12 12:49:02.336113 master-0 kubenswrapper[13984]: I0312 12:49:02.333148 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 12 12:49:03.029700 master-0 kubenswrapper[13984]: I0312 12:49:03.029646 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e","Type":"ContainerStarted","Data":"63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5"}
Mar 12 12:49:03.029700 master-0 kubenswrapper[13984]: I0312 12:49:03.029705 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e","Type":"ContainerStarted","Data":"e4103d2c06521d28a7b327bb86567f40a041181b4dfa60c43c0095d49872b955"}
Mar 12 12:49:03.865348 master-0 kubenswrapper[13984]: I0312 12:49:03.865185 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0"
Mar 12 12:49:04.101435 master-0 kubenswrapper[13984]: I0312 12:49:04.101365 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0"
Mar 12 12:49:04.152499 master-0 kubenswrapper[13984]: I0312 12:49:04.152307 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.152268302 podStartE2EDuration="3.152268302s" podCreationTimestamp="2026-03-12 12:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:03.057397972 +0000 UTC m=+1475.255413464" watchObservedRunningTime="2026-03-12 12:49:04.152268302 +0000 UTC m=+1476.350283794"
Mar 12 12:49:05.073959 master-0 kubenswrapper[13984]: I0312 12:49:05.073902 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0"
Mar 12 12:49:05.463058 master-0 kubenswrapper[13984]: I0312 12:49:05.462996 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0"
Mar 12 12:49:06.329146 master-0 kubenswrapper[13984]: I0312 12:49:06.329067 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 12 12:49:06.330003 master-0 kubenswrapper[13984]: I0312 12:49:06.329972 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Mar 12 12:49:06.769280 master-0 kubenswrapper[13984]: I0312 12:49:06.769200 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 12 12:49:07.341776 master-0 kubenswrapper[13984]: I0312 12:49:07.341716 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.0.251:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:49:07.341972 master-0 kubenswrapper[13984]: I0312 12:49:07.341740 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.0.251:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:49:09.586096 master-0 kubenswrapper[13984]: I0312 12:49:09.585980 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 12:49:09.586096 master-0 kubenswrapper[13984]: I0312 12:49:09.586109 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 12 12:49:09.668075 master-0 kubenswrapper[13984]: I0312 12:49:09.667992 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Mar 12 12:49:10.668870 master-0 kubenswrapper[13984]: I0312 12:49:10.668775 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.0.252:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:49:10.669433 master-0 kubenswrapper[13984]: I0312 12:49:10.668941 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.0.252:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 12 12:49:11.769800 master-0 kubenswrapper[13984]: I0312 12:49:11.769730 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 12 12:49:11.814186 master-0 kubenswrapper[13984]: I0312 12:49:11.814136 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 12 12:49:12.181462 master-0 kubenswrapper[13984]: I0312 12:49:12.181356 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 12 12:49:13.036333 master-0 kubenswrapper[13984]: I0312 12:49:13.036257 13984 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.188104 master-0 kubenswrapper[13984]: I0312 12:49:13.187928 13984 generic.go:334] "Generic (PLEG): container finished" podID="295668ef-83e3-408c-a447-34e6a52cf61e" containerID="6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958" exitCode=137 Mar 12 12:49:13.188104 master-0 kubenswrapper[13984]: I0312 12:49:13.187986 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.188104 master-0 kubenswrapper[13984]: I0312 12:49:13.188036 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"295668ef-83e3-408c-a447-34e6a52cf61e","Type":"ContainerDied","Data":"6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958"} Mar 12 12:49:13.188400 master-0 kubenswrapper[13984]: I0312 12:49:13.188107 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"295668ef-83e3-408c-a447-34e6a52cf61e","Type":"ContainerDied","Data":"94bdbea22967a8d8ce6d69d97fa63569973685ab248ef46772141dee0448d3f6"} Mar 12 12:49:13.188400 master-0 kubenswrapper[13984]: I0312 12:49:13.188138 13984 scope.go:117] "RemoveContainer" containerID="6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958" Mar 12 12:49:13.202138 master-0 kubenswrapper[13984]: I0312 12:49:13.202087 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7srqg\" (UniqueName: \"kubernetes.io/projected/295668ef-83e3-408c-a447-34e6a52cf61e-kube-api-access-7srqg\") pod \"295668ef-83e3-408c-a447-34e6a52cf61e\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " Mar 12 12:49:13.202262 master-0 kubenswrapper[13984]: I0312 12:49:13.202152 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-config-data\") pod \"295668ef-83e3-408c-a447-34e6a52cf61e\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " Mar 12 12:49:13.202439 master-0 kubenswrapper[13984]: I0312 12:49:13.202409 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-combined-ca-bundle\") pod \"295668ef-83e3-408c-a447-34e6a52cf61e\" (UID: \"295668ef-83e3-408c-a447-34e6a52cf61e\") " Mar 12 12:49:13.211808 master-0 kubenswrapper[13984]: I0312 12:49:13.211764 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/295668ef-83e3-408c-a447-34e6a52cf61e-kube-api-access-7srqg" (OuterVolumeSpecName: "kube-api-access-7srqg") pod "295668ef-83e3-408c-a447-34e6a52cf61e" (UID: "295668ef-83e3-408c-a447-34e6a52cf61e"). InnerVolumeSpecName "kube-api-access-7srqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:13.244873 master-0 kubenswrapper[13984]: I0312 12:49:13.244747 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-config-data" (OuterVolumeSpecName: "config-data") pod "295668ef-83e3-408c-a447-34e6a52cf61e" (UID: "295668ef-83e3-408c-a447-34e6a52cf61e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:13.253462 master-0 kubenswrapper[13984]: I0312 12:49:13.253355 13984 scope.go:117] "RemoveContainer" containerID="6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958" Mar 12 12:49:13.254592 master-0 kubenswrapper[13984]: E0312 12:49:13.254449 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958\": container with ID starting with 6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958 not found: ID does not exist" containerID="6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958" Mar 12 12:49:13.254592 master-0 kubenswrapper[13984]: I0312 12:49:13.254509 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958"} err="failed to get container status \"6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958\": rpc error: code = NotFound desc = could not find container \"6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958\": container with ID starting with 6a4be06f9d985eee6fcd8a72c6d262a9271060401050cfc4c87b6f11c611a958 not found: ID does not exist" Mar 12 12:49:13.255593 master-0 kubenswrapper[13984]: I0312 12:49:13.255541 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "295668ef-83e3-408c-a447-34e6a52cf61e" (UID: "295668ef-83e3-408c-a447-34e6a52cf61e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:13.307262 master-0 kubenswrapper[13984]: I0312 12:49:13.307140 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:13.307555 master-0 kubenswrapper[13984]: I0312 12:49:13.307542 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7srqg\" (UniqueName: \"kubernetes.io/projected/295668ef-83e3-408c-a447-34e6a52cf61e-kube-api-access-7srqg\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:13.307724 master-0 kubenswrapper[13984]: I0312 12:49:13.307710 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/295668ef-83e3-408c-a447-34e6a52cf61e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:13.534445 master-0 kubenswrapper[13984]: I0312 12:49:13.534369 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 12:49:13.547166 master-0 kubenswrapper[13984]: I0312 12:49:13.547099 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 12:49:13.584375 master-0 kubenswrapper[13984]: I0312 12:49:13.583755 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 12:49:13.584708 master-0 kubenswrapper[13984]: E0312 12:49:13.584668 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="295668ef-83e3-408c-a447-34e6a52cf61e" containerName="nova-cell1-novncproxy-novncproxy" Mar 12 12:49:13.584769 master-0 kubenswrapper[13984]: I0312 12:49:13.584708 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="295668ef-83e3-408c-a447-34e6a52cf61e" containerName="nova-cell1-novncproxy-novncproxy" Mar 12 12:49:13.585232 master-0 kubenswrapper[13984]: I0312 12:49:13.585197 13984 
memory_manager.go:354] "RemoveStaleState removing state" podUID="295668ef-83e3-408c-a447-34e6a52cf61e" containerName="nova-cell1-novncproxy-novncproxy" Mar 12 12:49:13.586750 master-0 kubenswrapper[13984]: I0312 12:49:13.586715 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.591926 master-0 kubenswrapper[13984]: I0312 12:49:13.591883 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 12 12:49:13.592436 master-0 kubenswrapper[13984]: I0312 12:49:13.592418 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 12 12:49:13.592894 master-0 kubenswrapper[13984]: I0312 12:49:13.592836 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 12 12:49:13.609036 master-0 kubenswrapper[13984]: I0312 12:49:13.608956 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 12:49:13.718619 master-0 kubenswrapper[13984]: I0312 12:49:13.718547 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.719151 master-0 kubenswrapper[13984]: I0312 12:49:13.719132 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.719262 master-0 
kubenswrapper[13984]: I0312 12:49:13.719247 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.719974 master-0 kubenswrapper[13984]: I0312 12:49:13.719696 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skmxp\" (UniqueName: \"kubernetes.io/projected/6b2ee394-60ab-4e16-8d45-7992a2ba039d-kube-api-access-skmxp\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.720251 master-0 kubenswrapper[13984]: I0312 12:49:13.720170 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.822965 master-0 kubenswrapper[13984]: I0312 12:49:13.822786 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.822965 master-0 kubenswrapper[13984]: I0312 12:49:13.822908 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 
12 12:49:13.823201 master-0 kubenswrapper[13984]: I0312 12:49:13.822985 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.823201 master-0 kubenswrapper[13984]: I0312 12:49:13.823014 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.823201 master-0 kubenswrapper[13984]: I0312 12:49:13.823089 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skmxp\" (UniqueName: \"kubernetes.io/projected/6b2ee394-60ab-4e16-8d45-7992a2ba039d-kube-api-access-skmxp\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.828089 master-0 kubenswrapper[13984]: I0312 12:49:13.827936 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.830371 master-0 kubenswrapper[13984]: I0312 12:49:13.830335 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 
12:49:13.832609 master-0 kubenswrapper[13984]: I0312 12:49:13.832571 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.839663 master-0 kubenswrapper[13984]: I0312 12:49:13.839627 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b2ee394-60ab-4e16-8d45-7992a2ba039d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.851204 master-0 kubenswrapper[13984]: I0312 12:49:13.851142 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skmxp\" (UniqueName: \"kubernetes.io/projected/6b2ee394-60ab-4e16-8d45-7992a2ba039d-kube-api-access-skmxp\") pod \"nova-cell1-novncproxy-0\" (UID: \"6b2ee394-60ab-4e16-8d45-7992a2ba039d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.931686 master-0 kubenswrapper[13984]: I0312 12:49:13.931616 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:13.996230 master-0 kubenswrapper[13984]: I0312 12:49:13.996181 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="295668ef-83e3-408c-a447-34e6a52cf61e" path="/var/lib/kubelet/pods/295668ef-83e3-408c-a447-34e6a52cf61e/volumes" Mar 12 12:49:14.672434 master-0 kubenswrapper[13984]: W0312 12:49:14.672318 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b2ee394_60ab_4e16_8d45_7992a2ba039d.slice/crio-a80d7ec98c76537c69e573b1cb759fb7c51df02c2aa6d7bccce88ccceefc700f WatchSource:0}: Error finding container a80d7ec98c76537c69e573b1cb759fb7c51df02c2aa6d7bccce88ccceefc700f: Status 404 returned error can't find the container with id a80d7ec98c76537c69e573b1cb759fb7c51df02c2aa6d7bccce88ccceefc700f Mar 12 12:49:14.696021 master-0 kubenswrapper[13984]: I0312 12:49:14.694791 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 12 12:49:15.226374 master-0 kubenswrapper[13984]: I0312 12:49:15.225475 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b2ee394-60ab-4e16-8d45-7992a2ba039d","Type":"ContainerStarted","Data":"d3c7305a108833b3fb46a920ecf303bfd3b8c0476d5b455b880ec3801d060bf2"} Mar 12 12:49:15.226374 master-0 kubenswrapper[13984]: I0312 12:49:15.225539 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"6b2ee394-60ab-4e16-8d45-7992a2ba039d","Type":"ContainerStarted","Data":"a80d7ec98c76537c69e573b1cb759fb7c51df02c2aa6d7bccce88ccceefc700f"} Mar 12 12:49:15.251403 master-0 kubenswrapper[13984]: I0312 12:49:15.251272 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.251230865 podStartE2EDuration="2.251230865s" podCreationTimestamp="2026-03-12 
12:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:15.249899548 +0000 UTC m=+1487.447915040" watchObservedRunningTime="2026-03-12 12:49:15.251230865 +0000 UTC m=+1487.449246357" Mar 12 12:49:16.336142 master-0 kubenswrapper[13984]: I0312 12:49:16.336074 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 12:49:16.341139 master-0 kubenswrapper[13984]: I0312 12:49:16.341083 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 12:49:16.345364 master-0 kubenswrapper[13984]: I0312 12:49:16.345322 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 12:49:17.260882 master-0 kubenswrapper[13984]: I0312 12:49:17.260802 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 12:49:18.932433 master-0 kubenswrapper[13984]: I0312 12:49:18.932359 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:19.590815 master-0 kubenswrapper[13984]: I0312 12:49:19.590407 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 12:49:19.591032 master-0 kubenswrapper[13984]: I0312 12:49:19.590950 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 12:49:19.593222 master-0 kubenswrapper[13984]: I0312 12:49:19.593142 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 12:49:19.597911 master-0 kubenswrapper[13984]: I0312 12:49:19.597842 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 12:49:20.285337 master-0 kubenswrapper[13984]: I0312 12:49:20.285271 13984 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 12:49:20.288443 master-0 kubenswrapper[13984]: I0312 12:49:20.288384 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 12:49:20.595259 master-0 kubenswrapper[13984]: I0312 12:49:20.592628 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fd99d4b97-wqbt9"] Mar 12 12:49:20.595259 master-0 kubenswrapper[13984]: I0312 12:49:20.595079 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.603614 master-0 kubenswrapper[13984]: I0312 12:49:20.602069 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd99d4b97-wqbt9"] Mar 12 12:49:20.733571 master-0 kubenswrapper[13984]: I0312 12:49:20.731937 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smxgd\" (UniqueName: \"kubernetes.io/projected/722fad77-eaea-4605-b6a6-58ece6b90875-kube-api-access-smxgd\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.733571 master-0 kubenswrapper[13984]: I0312 12:49:20.731999 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-config\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.733571 master-0 kubenswrapper[13984]: I0312 12:49:20.732071 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" 
(UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.733571 master-0 kubenswrapper[13984]: I0312 12:49:20.732122 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-dns-svc\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.733571 master-0 kubenswrapper[13984]: I0312 12:49:20.732221 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.733571 master-0 kubenswrapper[13984]: I0312 12:49:20.732250 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.834625 master-0 kubenswrapper[13984]: I0312 12:49:20.834453 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-config\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.834818 master-0 kubenswrapper[13984]: I0312 12:49:20.834657 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.834988 master-0 kubenswrapper[13984]: I0312 12:49:20.834952 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-dns-svc\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.835154 master-0 kubenswrapper[13984]: I0312 12:49:20.835125 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.835200 master-0 kubenswrapper[13984]: I0312 12:49:20.835178 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.835235 master-0 kubenswrapper[13984]: I0312 12:49:20.835223 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smxgd\" (UniqueName: \"kubernetes.io/projected/722fad77-eaea-4605-b6a6-58ece6b90875-kube-api-access-smxgd\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.835507 master-0 kubenswrapper[13984]: I0312 12:49:20.835458 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-config\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.835641 master-0 kubenswrapper[13984]: I0312 12:49:20.835605 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-dns-swift-storage-0\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.836515 master-0 kubenswrapper[13984]: I0312 12:49:20.836465 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-dns-svc\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.836902 master-0 kubenswrapper[13984]: I0312 12:49:20.836850 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-ovsdbserver-nb\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.837283 master-0 kubenswrapper[13984]: I0312 12:49:20.837194 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/722fad77-eaea-4605-b6a6-58ece6b90875-ovsdbserver-sb\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.863603 master-0 kubenswrapper[13984]: I0312 12:49:20.863557 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smxgd\" (UniqueName: 
\"kubernetes.io/projected/722fad77-eaea-4605-b6a6-58ece6b90875-kube-api-access-smxgd\") pod \"dnsmasq-dns-6fd99d4b97-wqbt9\" (UID: \"722fad77-eaea-4605-b6a6-58ece6b90875\") " pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:20.935025 master-0 kubenswrapper[13984]: I0312 12:49:20.934965 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:21.510460 master-0 kubenswrapper[13984]: I0312 12:49:21.510425 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fd99d4b97-wqbt9"] Mar 12 12:49:22.312974 master-0 kubenswrapper[13984]: I0312 12:49:22.312909 13984 generic.go:334] "Generic (PLEG): container finished" podID="722fad77-eaea-4605-b6a6-58ece6b90875" containerID="4a521cc242e30bd989c911b566b8fda98c654db5cf1152837cb3ca229198b994" exitCode=0 Mar 12 12:49:22.314376 master-0 kubenswrapper[13984]: I0312 12:49:22.314164 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" event={"ID":"722fad77-eaea-4605-b6a6-58ece6b90875","Type":"ContainerDied","Data":"4a521cc242e30bd989c911b566b8fda98c654db5cf1152837cb3ca229198b994"} Mar 12 12:49:22.314376 master-0 kubenswrapper[13984]: I0312 12:49:22.314206 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" event={"ID":"722fad77-eaea-4605-b6a6-58ece6b90875","Type":"ContainerStarted","Data":"8d30570ea7d2dbd41765c5987a6fd69f0678854f1ba74abace5469f4db8cb403"} Mar 12 12:49:23.334557 master-0 kubenswrapper[13984]: I0312 12:49:23.333431 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" event={"ID":"722fad77-eaea-4605-b6a6-58ece6b90875","Type":"ContainerStarted","Data":"da70de44559d09691d6d6d53304a2e94a7c7b6e2581dc5274ec739cc5b05ca0a"} Mar 12 12:49:23.334557 master-0 kubenswrapper[13984]: I0312 12:49:23.333597 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:23.367371 master-0 kubenswrapper[13984]: I0312 12:49:23.366956 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" podStartSLOduration=3.366932977 podStartE2EDuration="3.366932977s" podCreationTimestamp="2026-03-12 12:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:23.354700762 +0000 UTC m=+1495.552716264" watchObservedRunningTime="2026-03-12 12:49:23.366932977 +0000 UTC m=+1495.564948469" Mar 12 12:49:23.565381 master-0 kubenswrapper[13984]: I0312 12:49:23.565316 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:23.570501 master-0 kubenswrapper[13984]: I0312 12:49:23.565625 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-log" containerID="cri-o://259b967a0dfa1a314b3fc2ba53759034dc95967bba03e380e04dbfab68d647da" gracePeriod=30 Mar 12 12:49:23.570501 master-0 kubenswrapper[13984]: I0312 12:49:23.566284 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-api" containerID="cri-o://e95df6d22c5135174dc12529ece3e7aa8a10239725074fd1c0f193a47a459627" gracePeriod=30 Mar 12 12:49:23.855382 master-0 kubenswrapper[13984]: E0312 12:49:23.855320 13984 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod458d63ec_daa2_41a3_903e_08d1a10c29df.slice/crio-conmon-259b967a0dfa1a314b3fc2ba53759034dc95967bba03e380e04dbfab68d647da.scope\": RecentStats: unable to find data in memory cache]" Mar 12 12:49:23.932383 master-0 kubenswrapper[13984]: I0312 
12:49:23.932234 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:23.953724 master-0 kubenswrapper[13984]: I0312 12:49:23.953655 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:24.353344 master-0 kubenswrapper[13984]: I0312 12:49:24.353286 13984 generic.go:334] "Generic (PLEG): container finished" podID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerID="259b967a0dfa1a314b3fc2ba53759034dc95967bba03e380e04dbfab68d647da" exitCode=143 Mar 12 12:49:24.353967 master-0 kubenswrapper[13984]: I0312 12:49:24.353449 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458d63ec-daa2-41a3-903e-08d1a10c29df","Type":"ContainerDied","Data":"259b967a0dfa1a314b3fc2ba53759034dc95967bba03e380e04dbfab68d647da"} Mar 12 12:49:24.377800 master-0 kubenswrapper[13984]: I0312 12:49:24.377586 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 12 12:49:24.620499 master-0 kubenswrapper[13984]: I0312 12:49:24.609261 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-52nsz"] Mar 12 12:49:24.620499 master-0 kubenswrapper[13984]: I0312 12:49:24.615733 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.632002 master-0 kubenswrapper[13984]: I0312 12:49:24.631955 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 12 12:49:24.632211 master-0 kubenswrapper[13984]: I0312 12:49:24.632157 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 12 12:49:24.633234 master-0 kubenswrapper[13984]: I0312 12:49:24.633197 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-52nsz"] Mar 12 12:49:24.678284 master-0 kubenswrapper[13984]: I0312 12:49:24.678228 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-j4k72"] Mar 12 12:49:24.701791 master-0 kubenswrapper[13984]: I0312 12:49:24.701737 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.707142 master-0 kubenswrapper[13984]: I0312 12:49:24.707032 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-j4k72"] Mar 12 12:49:24.720696 master-0 kubenswrapper[13984]: I0312 12:49:24.719857 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8n4j\" (UniqueName: \"kubernetes.io/projected/966ac026-07e7-4214-9c35-942fcd250899-kube-api-access-b8n4j\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.720696 master-0 kubenswrapper[13984]: I0312 12:49:24.720006 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-scripts\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " 
pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.720696 master-0 kubenswrapper[13984]: I0312 12:49:24.720052 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.720696 master-0 kubenswrapper[13984]: I0312 12:49:24.720185 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-config-data\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.821894 master-0 kubenswrapper[13984]: I0312 12:49:24.821823 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-combined-ca-bundle\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.822140 master-0 kubenswrapper[13984]: I0312 12:49:24.821929 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8n4j\" (UniqueName: \"kubernetes.io/projected/966ac026-07e7-4214-9c35-942fcd250899-kube-api-access-b8n4j\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.822140 master-0 kubenswrapper[13984]: I0312 12:49:24.822067 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-scripts\") pod 
\"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.822140 master-0 kubenswrapper[13984]: I0312 12:49:24.822107 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-config-data\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.822737 master-0 kubenswrapper[13984]: I0312 12:49:24.822144 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.822737 master-0 kubenswrapper[13984]: I0312 12:49:24.822414 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-scripts\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.822737 master-0 kubenswrapper[13984]: I0312 12:49:24.822472 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2g2w\" (UniqueName: \"kubernetes.io/projected/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-kube-api-access-k2g2w\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.822737 master-0 kubenswrapper[13984]: I0312 12:49:24.822524 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-config-data\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.838558 master-0 kubenswrapper[13984]: I0312 12:49:24.825747 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.838558 master-0 kubenswrapper[13984]: I0312 12:49:24.826262 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-scripts\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.841583 master-0 kubenswrapper[13984]: I0312 12:49:24.840565 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-config-data\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.843834 master-0 kubenswrapper[13984]: I0312 12:49:24.843792 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8n4j\" (UniqueName: \"kubernetes.io/projected/966ac026-07e7-4214-9c35-942fcd250899-kube-api-access-b8n4j\") pod \"nova-cell1-cell-mapping-52nsz\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:24.924950 master-0 kubenswrapper[13984]: I0312 12:49:24.924816 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-config-data\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.925148 master-0 kubenswrapper[13984]: I0312 12:49:24.924968 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-scripts\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.925148 master-0 kubenswrapper[13984]: I0312 12:49:24.925048 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2g2w\" (UniqueName: \"kubernetes.io/projected/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-kube-api-access-k2g2w\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.925148 master-0 kubenswrapper[13984]: I0312 12:49:24.925136 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-combined-ca-bundle\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.928941 master-0 kubenswrapper[13984]: I0312 12:49:24.928908 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-config-data\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.929150 master-0 kubenswrapper[13984]: I0312 12:49:24.928943 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-scripts\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.934575 master-0 kubenswrapper[13984]: I0312 12:49:24.931333 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-combined-ca-bundle\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.952215 master-0 kubenswrapper[13984]: I0312 12:49:24.951956 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2g2w\" (UniqueName: \"kubernetes.io/projected/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-kube-api-access-k2g2w\") pod \"nova-cell1-host-discover-j4k72\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:24.965511 master-0 kubenswrapper[13984]: I0312 12:49:24.965420 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:25.032872 master-0 kubenswrapper[13984]: I0312 12:49:25.032790 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:25.470844 master-0 kubenswrapper[13984]: I0312 12:49:25.470697 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-52nsz"] Mar 12 12:49:25.639575 master-0 kubenswrapper[13984]: I0312 12:49:25.639150 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-j4k72"] Mar 12 12:49:26.387392 master-0 kubenswrapper[13984]: I0312 12:49:26.387333 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-52nsz" event={"ID":"966ac026-07e7-4214-9c35-942fcd250899","Type":"ContainerStarted","Data":"59a3f7872c850a39d565e93622e7b1c35544876d5092624ee082902d3972bf0c"} Mar 12 12:49:26.387392 master-0 kubenswrapper[13984]: I0312 12:49:26.387392 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-52nsz" event={"ID":"966ac026-07e7-4214-9c35-942fcd250899","Type":"ContainerStarted","Data":"9761364bff8e9466d3da760d37ea141e3fd0be8c50b1c732556cc8609796f0de"} Mar 12 12:49:26.392584 master-0 kubenswrapper[13984]: I0312 12:49:26.392523 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j4k72" event={"ID":"bc21a66e-84d4-4147-a050-e03b4cfc5ddd","Type":"ContainerStarted","Data":"d26c70f253b5743c8629cececd0103a066cbf26e8be267e46d5a5419b45ef5cb"} Mar 12 12:49:26.392584 master-0 kubenswrapper[13984]: I0312 12:49:26.392559 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j4k72" event={"ID":"bc21a66e-84d4-4147-a050-e03b4cfc5ddd","Type":"ContainerStarted","Data":"31048ccd9e6470d286ee6d96b6b56a582e3c28fe8473a769b3e69573aa3e8833"} Mar 12 12:49:26.407795 master-0 kubenswrapper[13984]: I0312 12:49:26.407634 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-52nsz" podStartSLOduration=2.407595322 
podStartE2EDuration="2.407595322s" podCreationTimestamp="2026-03-12 12:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:26.403549001 +0000 UTC m=+1498.601564493" watchObservedRunningTime="2026-03-12 12:49:26.407595322 +0000 UTC m=+1498.605610814" Mar 12 12:49:26.445978 master-0 kubenswrapper[13984]: I0312 12:49:26.444938 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-j4k72" podStartSLOduration=2.4449152339999998 podStartE2EDuration="2.444915234s" podCreationTimestamp="2026-03-12 12:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:26.421334678 +0000 UTC m=+1498.619350160" watchObservedRunningTime="2026-03-12 12:49:26.444915234 +0000 UTC m=+1498.642930726" Mar 12 12:49:27.409895 master-0 kubenswrapper[13984]: I0312 12:49:27.409837 13984 generic.go:334] "Generic (PLEG): container finished" podID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerID="e95df6d22c5135174dc12529ece3e7aa8a10239725074fd1c0f193a47a459627" exitCode=0 Mar 12 12:49:27.410670 master-0 kubenswrapper[13984]: I0312 12:49:27.410023 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458d63ec-daa2-41a3-903e-08d1a10c29df","Type":"ContainerDied","Data":"e95df6d22c5135174dc12529ece3e7aa8a10239725074fd1c0f193a47a459627"} Mar 12 12:49:27.410670 master-0 kubenswrapper[13984]: I0312 12:49:27.410075 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458d63ec-daa2-41a3-903e-08d1a10c29df","Type":"ContainerDied","Data":"7b112264da30bd63bfb1639c2b115eb3d823493c7fcb4312761960a23e40f766"} Mar 12 12:49:27.410670 master-0 kubenswrapper[13984]: I0312 12:49:27.410091 13984 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7b112264da30bd63bfb1639c2b115eb3d823493c7fcb4312761960a23e40f766" Mar 12 12:49:27.424798 master-0 kubenswrapper[13984]: I0312 12:49:27.424741 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:27.566615 master-0 kubenswrapper[13984]: I0312 12:49:27.561687 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-combined-ca-bundle\") pod \"458d63ec-daa2-41a3-903e-08d1a10c29df\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " Mar 12 12:49:27.566615 master-0 kubenswrapper[13984]: I0312 12:49:27.561802 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458d63ec-daa2-41a3-903e-08d1a10c29df-logs\") pod \"458d63ec-daa2-41a3-903e-08d1a10c29df\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " Mar 12 12:49:27.566615 master-0 kubenswrapper[13984]: I0312 12:49:27.561849 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-config-data\") pod \"458d63ec-daa2-41a3-903e-08d1a10c29df\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " Mar 12 12:49:27.566615 master-0 kubenswrapper[13984]: I0312 12:49:27.561982 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7d8x4\" (UniqueName: \"kubernetes.io/projected/458d63ec-daa2-41a3-903e-08d1a10c29df-kube-api-access-7d8x4\") pod \"458d63ec-daa2-41a3-903e-08d1a10c29df\" (UID: \"458d63ec-daa2-41a3-903e-08d1a10c29df\") " Mar 12 12:49:27.577817 master-0 kubenswrapper[13984]: I0312 12:49:27.577759 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458d63ec-daa2-41a3-903e-08d1a10c29df-logs" (OuterVolumeSpecName: "logs") pod 
"458d63ec-daa2-41a3-903e-08d1a10c29df" (UID: "458d63ec-daa2-41a3-903e-08d1a10c29df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:49:27.591973 master-0 kubenswrapper[13984]: I0312 12:49:27.591819 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458d63ec-daa2-41a3-903e-08d1a10c29df-kube-api-access-7d8x4" (OuterVolumeSpecName: "kube-api-access-7d8x4") pod "458d63ec-daa2-41a3-903e-08d1a10c29df" (UID: "458d63ec-daa2-41a3-903e-08d1a10c29df"). InnerVolumeSpecName "kube-api-access-7d8x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:27.621261 master-0 kubenswrapper[13984]: I0312 12:49:27.621205 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "458d63ec-daa2-41a3-903e-08d1a10c29df" (UID: "458d63ec-daa2-41a3-903e-08d1a10c29df"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:27.624204 master-0 kubenswrapper[13984]: I0312 12:49:27.624144 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-config-data" (OuterVolumeSpecName: "config-data") pod "458d63ec-daa2-41a3-903e-08d1a10c29df" (UID: "458d63ec-daa2-41a3-903e-08d1a10c29df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:27.665488 master-0 kubenswrapper[13984]: I0312 12:49:27.665373 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:27.665488 master-0 kubenswrapper[13984]: I0312 12:49:27.665421 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458d63ec-daa2-41a3-903e-08d1a10c29df-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:27.665488 master-0 kubenswrapper[13984]: I0312 12:49:27.665434 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458d63ec-daa2-41a3-903e-08d1a10c29df-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:27.665488 master-0 kubenswrapper[13984]: I0312 12:49:27.665447 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7d8x4\" (UniqueName: \"kubernetes.io/projected/458d63ec-daa2-41a3-903e-08d1a10c29df-kube-api-access-7d8x4\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:28.420370 master-0 kubenswrapper[13984]: I0312 12:49:28.420329 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:28.549192 master-0 kubenswrapper[13984]: I0312 12:49:28.549132 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:28.575503 master-0 kubenswrapper[13984]: I0312 12:49:28.572537 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:28.593092 master-0 kubenswrapper[13984]: I0312 12:49:28.593036 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:28.593676 master-0 kubenswrapper[13984]: E0312 12:49:28.593646 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-log" Mar 12 12:49:28.593676 master-0 kubenswrapper[13984]: I0312 12:49:28.593672 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-log" Mar 12 12:49:28.593774 master-0 kubenswrapper[13984]: E0312 12:49:28.593689 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-api" Mar 12 12:49:28.593774 master-0 kubenswrapper[13984]: I0312 12:49:28.593699 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-api" Mar 12 12:49:28.594024 master-0 kubenswrapper[13984]: I0312 12:49:28.593985 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-api" Mar 12 12:49:28.594066 master-0 kubenswrapper[13984]: I0312 12:49:28.594034 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" containerName="nova-api-log" Mar 12 12:49:28.595328 master-0 kubenswrapper[13984]: I0312 12:49:28.595289 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:28.598563 master-0 kubenswrapper[13984]: I0312 12:49:28.598430 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 12 12:49:28.598704 master-0 kubenswrapper[13984]: I0312 12:49:28.598607 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 12:49:28.598704 master-0 kubenswrapper[13984]: I0312 12:49:28.598463 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 12 12:49:28.629181 master-0 kubenswrapper[13984]: I0312 12:49:28.629137 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:28.800869 master-0 kubenswrapper[13984]: I0312 12:49:28.800725 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-logs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.800869 master-0 kubenswrapper[13984]: I0312 12:49:28.800781 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.801686 master-0 kubenswrapper[13984]: I0312 12:49:28.800883 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djrd8\" (UniqueName: \"kubernetes.io/projected/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-kube-api-access-djrd8\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.801686 master-0 kubenswrapper[13984]: I0312 12:49:28.801044 13984 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.801686 master-0 kubenswrapper[13984]: I0312 12:49:28.801075 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-public-tls-certs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.801686 master-0 kubenswrapper[13984]: I0312 12:49:28.801119 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-config-data\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.903161 master-0 kubenswrapper[13984]: I0312 12:49:28.903117 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-config-data\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.903444 master-0 kubenswrapper[13984]: I0312 12:49:28.903428 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-logs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.903557 master-0 kubenswrapper[13984]: I0312 12:49:28.903543 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.903678 master-0 kubenswrapper[13984]: I0312 12:49:28.903664 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djrd8\" (UniqueName: \"kubernetes.io/projected/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-kube-api-access-djrd8\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.904255 master-0 kubenswrapper[13984]: I0312 12:49:28.904239 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.904370 master-0 kubenswrapper[13984]: I0312 12:49:28.904356 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-public-tls-certs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.904833 master-0 kubenswrapper[13984]: I0312 12:49:28.904089 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-logs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.906873 master-0 kubenswrapper[13984]: I0312 12:49:28.906834 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-config-data\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 
12:49:28.907509 master-0 kubenswrapper[13984]: I0312 12:49:28.907464 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-public-tls-certs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.907657 master-0 kubenswrapper[13984]: I0312 12:49:28.907609 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.908522 master-0 kubenswrapper[13984]: I0312 12:49:28.907724 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-internal-tls-certs\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.922103 master-0 kubenswrapper[13984]: I0312 12:49:28.922043 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djrd8\" (UniqueName: \"kubernetes.io/projected/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-kube-api-access-djrd8\") pod \"nova-api-0\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " pod="openstack/nova-api-0" Mar 12 12:49:28.935623 master-0 kubenswrapper[13984]: I0312 12:49:28.934507 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:29.453890 master-0 kubenswrapper[13984]: I0312 12:49:29.451450 13984 generic.go:334] "Generic (PLEG): container finished" podID="bc21a66e-84d4-4147-a050-e03b4cfc5ddd" containerID="d26c70f253b5743c8629cececd0103a066cbf26e8be267e46d5a5419b45ef5cb" exitCode=0 Mar 12 12:49:29.453890 master-0 kubenswrapper[13984]: I0312 12:49:29.451524 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j4k72" event={"ID":"bc21a66e-84d4-4147-a050-e03b4cfc5ddd","Type":"ContainerDied","Data":"d26c70f253b5743c8629cececd0103a066cbf26e8be267e46d5a5419b45ef5cb"} Mar 12 12:49:29.462404 master-0 kubenswrapper[13984]: I0312 12:49:29.462137 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:29.996310 master-0 kubenswrapper[13984]: I0312 12:49:29.996158 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458d63ec-daa2-41a3-903e-08d1a10c29df" path="/var/lib/kubelet/pods/458d63ec-daa2-41a3-903e-08d1a10c29df/volumes" Mar 12 12:49:30.467308 master-0 kubenswrapper[13984]: I0312 12:49:30.467213 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbaa7f55-ae33-43a0-945c-0a6988ea56fa","Type":"ContainerStarted","Data":"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541"} Mar 12 12:49:30.467308 master-0 kubenswrapper[13984]: I0312 12:49:30.467257 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbaa7f55-ae33-43a0-945c-0a6988ea56fa","Type":"ContainerStarted","Data":"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa"} Mar 12 12:49:30.467308 master-0 kubenswrapper[13984]: I0312 12:49:30.467267 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"fbaa7f55-ae33-43a0-945c-0a6988ea56fa","Type":"ContainerStarted","Data":"3123ff0816b28abdbf68d97e18b7f1bc8abb9378a14c182af51a2046eeb775b6"} Mar 12 12:49:30.513546 master-0 kubenswrapper[13984]: I0312 12:49:30.513432 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.513408212 podStartE2EDuration="2.513408212s" podCreationTimestamp="2026-03-12 12:49:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:30.501740012 +0000 UTC m=+1502.699755544" watchObservedRunningTime="2026-03-12 12:49:30.513408212 +0000 UTC m=+1502.711423704" Mar 12 12:49:30.936484 master-0 kubenswrapper[13984]: I0312 12:49:30.936402 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6fd99d4b97-wqbt9" Mar 12 12:49:31.028124 master-0 kubenswrapper[13984]: I0312 12:49:31.027500 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fbc4c555-858hz"] Mar 12 12:49:31.028124 master-0 kubenswrapper[13984]: I0312 12:49:31.027831 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" podUID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerName="dnsmasq-dns" containerID="cri-o://3a00fa6fc636cdb063a263fc33e09c250a731ec1bc0c4460258c49ff774d8c5e" gracePeriod=10 Mar 12 12:49:31.300626 master-0 kubenswrapper[13984]: I0312 12:49:31.300571 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:31.383661 master-0 kubenswrapper[13984]: I0312 12:49:31.383544 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-scripts\") pod \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " Mar 12 12:49:31.383661 master-0 kubenswrapper[13984]: I0312 12:49:31.383657 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-config-data\") pod \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " Mar 12 12:49:31.385111 master-0 kubenswrapper[13984]: I0312 12:49:31.383974 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2g2w\" (UniqueName: \"kubernetes.io/projected/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-kube-api-access-k2g2w\") pod \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " Mar 12 12:49:31.385111 master-0 kubenswrapper[13984]: I0312 12:49:31.384093 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-combined-ca-bundle\") pod \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\" (UID: \"bc21a66e-84d4-4147-a050-e03b4cfc5ddd\") " Mar 12 12:49:31.395665 master-0 kubenswrapper[13984]: I0312 12:49:31.395317 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-scripts" (OuterVolumeSpecName: "scripts") pod "bc21a66e-84d4-4147-a050-e03b4cfc5ddd" (UID: "bc21a66e-84d4-4147-a050-e03b4cfc5ddd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:31.409506 master-0 kubenswrapper[13984]: I0312 12:49:31.409329 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-kube-api-access-k2g2w" (OuterVolumeSpecName: "kube-api-access-k2g2w") pod "bc21a66e-84d4-4147-a050-e03b4cfc5ddd" (UID: "bc21a66e-84d4-4147-a050-e03b4cfc5ddd"). InnerVolumeSpecName "kube-api-access-k2g2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:31.425780 master-0 kubenswrapper[13984]: I0312 12:49:31.425714 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bc21a66e-84d4-4147-a050-e03b4cfc5ddd" (UID: "bc21a66e-84d4-4147-a050-e03b4cfc5ddd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:31.432683 master-0 kubenswrapper[13984]: I0312 12:49:31.432555 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-config-data" (OuterVolumeSpecName: "config-data") pod "bc21a66e-84d4-4147-a050-e03b4cfc5ddd" (UID: "bc21a66e-84d4-4147-a050-e03b4cfc5ddd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:31.487358 master-0 kubenswrapper[13984]: I0312 12:49:31.487199 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.487358 master-0 kubenswrapper[13984]: I0312 12:49:31.487311 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.487358 master-0 kubenswrapper[13984]: I0312 12:49:31.487324 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2g2w\" (UniqueName: \"kubernetes.io/projected/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-kube-api-access-k2g2w\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.487358 master-0 kubenswrapper[13984]: I0312 12:49:31.487334 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bc21a66e-84d4-4147-a050-e03b4cfc5ddd-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.496013 master-0 kubenswrapper[13984]: I0312 12:49:31.495028 13984 generic.go:334] "Generic (PLEG): container finished" podID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerID="3a00fa6fc636cdb063a263fc33e09c250a731ec1bc0c4460258c49ff774d8c5e" exitCode=0 Mar 12 12:49:31.496013 master-0 kubenswrapper[13984]: I0312 12:49:31.495141 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" event={"ID":"cc025fea-4c26-4327-b33c-0f29566b6b50","Type":"ContainerDied","Data":"3a00fa6fc636cdb063a263fc33e09c250a731ec1bc0c4460258c49ff774d8c5e"} Mar 12 12:49:31.506938 master-0 kubenswrapper[13984]: I0312 12:49:31.503537 13984 generic.go:334] "Generic (PLEG): container finished" podID="966ac026-07e7-4214-9c35-942fcd250899" 
containerID="59a3f7872c850a39d565e93622e7b1c35544876d5092624ee082902d3972bf0c" exitCode=0 Mar 12 12:49:31.506938 master-0 kubenswrapper[13984]: I0312 12:49:31.503611 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-52nsz" event={"ID":"966ac026-07e7-4214-9c35-942fcd250899","Type":"ContainerDied","Data":"59a3f7872c850a39d565e93622e7b1c35544876d5092624ee082902d3972bf0c"} Mar 12 12:49:31.514101 master-0 kubenswrapper[13984]: I0312 12:49:31.511927 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-j4k72" Mar 12 12:49:31.514889 master-0 kubenswrapper[13984]: I0312 12:49:31.514843 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-j4k72" event={"ID":"bc21a66e-84d4-4147-a050-e03b4cfc5ddd","Type":"ContainerDied","Data":"31048ccd9e6470d286ee6d96b6b56a582e3c28fe8473a769b3e69573aa3e8833"} Mar 12 12:49:31.514961 master-0 kubenswrapper[13984]: I0312 12:49:31.514897 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31048ccd9e6470d286ee6d96b6b56a582e3c28fe8473a769b3e69573aa3e8833" Mar 12 12:49:31.665225 master-0 kubenswrapper[13984]: I0312 12:49:31.664456 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:49:31.803500 master-0 kubenswrapper[13984]: I0312 12:49:31.794044 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-svc\") pod \"cc025fea-4c26-4327-b33c-0f29566b6b50\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " Mar 12 12:49:31.803500 master-0 kubenswrapper[13984]: I0312 12:49:31.794156 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-sb\") pod \"cc025fea-4c26-4327-b33c-0f29566b6b50\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " Mar 12 12:49:31.803500 master-0 kubenswrapper[13984]: I0312 12:49:31.794191 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-nb\") pod \"cc025fea-4c26-4327-b33c-0f29566b6b50\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " Mar 12 12:49:31.803500 master-0 kubenswrapper[13984]: I0312 12:49:31.794252 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-config\") pod \"cc025fea-4c26-4327-b33c-0f29566b6b50\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " Mar 12 12:49:31.803500 master-0 kubenswrapper[13984]: I0312 12:49:31.794308 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-swift-storage-0\") pod \"cc025fea-4c26-4327-b33c-0f29566b6b50\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " Mar 12 12:49:31.803500 master-0 kubenswrapper[13984]: I0312 12:49:31.794495 13984 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nbc5w\" (UniqueName: \"kubernetes.io/projected/cc025fea-4c26-4327-b33c-0f29566b6b50-kube-api-access-nbc5w\") pod \"cc025fea-4c26-4327-b33c-0f29566b6b50\" (UID: \"cc025fea-4c26-4327-b33c-0f29566b6b50\") " Mar 12 12:49:31.803500 master-0 kubenswrapper[13984]: I0312 12:49:31.799887 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc025fea-4c26-4327-b33c-0f29566b6b50-kube-api-access-nbc5w" (OuterVolumeSpecName: "kube-api-access-nbc5w") pod "cc025fea-4c26-4327-b33c-0f29566b6b50" (UID: "cc025fea-4c26-4327-b33c-0f29566b6b50"). InnerVolumeSpecName "kube-api-access-nbc5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:31.878500 master-0 kubenswrapper[13984]: I0312 12:49:31.862151 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cc025fea-4c26-4327-b33c-0f29566b6b50" (UID: "cc025fea-4c26-4327-b33c-0f29566b6b50"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:49:31.878500 master-0 kubenswrapper[13984]: I0312 12:49:31.862211 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cc025fea-4c26-4327-b33c-0f29566b6b50" (UID: "cc025fea-4c26-4327-b33c-0f29566b6b50"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:49:31.878500 master-0 kubenswrapper[13984]: I0312 12:49:31.870922 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-config" (OuterVolumeSpecName: "config") pod "cc025fea-4c26-4327-b33c-0f29566b6b50" (UID: "cc025fea-4c26-4327-b33c-0f29566b6b50"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:49:31.878500 master-0 kubenswrapper[13984]: E0312 12:49:31.878220 13984 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc21a66e_84d4_4147_a050_e03b4cfc5ddd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc21a66e_84d4_4147_a050_e03b4cfc5ddd.slice/crio-31048ccd9e6470d286ee6d96b6b56a582e3c28fe8473a769b3e69573aa3e8833\": RecentStats: unable to find data in memory cache]" Mar 12 12:49:31.882628 master-0 kubenswrapper[13984]: I0312 12:49:31.881894 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cc025fea-4c26-4327-b33c-0f29566b6b50" (UID: "cc025fea-4c26-4327-b33c-0f29566b6b50"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:49:31.897445 master-0 kubenswrapper[13984]: I0312 12:49:31.897382 13984 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.897445 master-0 kubenswrapper[13984]: I0312 12:49:31.897427 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.897445 master-0 kubenswrapper[13984]: I0312 12:49:31.897440 13984 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.897445 master-0 kubenswrapper[13984]: I0312 12:49:31.897448 13984 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.897445 master-0 kubenswrapper[13984]: I0312 12:49:31.897457 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbc5w\" (UniqueName: \"kubernetes.io/projected/cc025fea-4c26-4327-b33c-0f29566b6b50-kube-api-access-nbc5w\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:31.902812 master-0 kubenswrapper[13984]: I0312 12:49:31.902406 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cc025fea-4c26-4327-b33c-0f29566b6b50" (UID: "cc025fea-4c26-4327-b33c-0f29566b6b50"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:49:31.999631 master-0 kubenswrapper[13984]: I0312 12:49:31.999559 13984 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cc025fea-4c26-4327-b33c-0f29566b6b50-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:32.524598 master-0 kubenswrapper[13984]: I0312 12:49:32.524439 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" event={"ID":"cc025fea-4c26-4327-b33c-0f29566b6b50","Type":"ContainerDied","Data":"72ca92dd755c16daf3b54552c75ecb61052592e0cae074b0e1275e4393b3ca06"} Mar 12 12:49:32.524598 master-0 kubenswrapper[13984]: I0312 12:49:32.524492 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64fbc4c555-858hz" Mar 12 12:49:32.524598 master-0 kubenswrapper[13984]: I0312 12:49:32.524532 13984 scope.go:117] "RemoveContainer" containerID="3a00fa6fc636cdb063a263fc33e09c250a731ec1bc0c4460258c49ff774d8c5e" Mar 12 12:49:32.562164 master-0 kubenswrapper[13984]: I0312 12:49:32.562106 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64fbc4c555-858hz"] Mar 12 12:49:32.578911 master-0 kubenswrapper[13984]: I0312 12:49:32.578846 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64fbc4c555-858hz"] Mar 12 12:49:32.590872 master-0 kubenswrapper[13984]: I0312 12:49:32.590836 13984 scope.go:117] "RemoveContainer" containerID="3025f131d59f0777650800ba419a85418d16f72c075108ce2a2e97b05f16ac97" Mar 12 12:49:32.922374 master-0 kubenswrapper[13984]: I0312 12:49:32.922318 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:33.021024 master-0 kubenswrapper[13984]: I0312 12:49:33.020958 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-combined-ca-bundle\") pod \"966ac026-07e7-4214-9c35-942fcd250899\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " Mar 12 12:49:33.021302 master-0 kubenswrapper[13984]: I0312 12:49:33.021280 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8n4j\" (UniqueName: \"kubernetes.io/projected/966ac026-07e7-4214-9c35-942fcd250899-kube-api-access-b8n4j\") pod \"966ac026-07e7-4214-9c35-942fcd250899\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " Mar 12 12:49:33.021577 master-0 kubenswrapper[13984]: I0312 12:49:33.021555 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-scripts\") pod \"966ac026-07e7-4214-9c35-942fcd250899\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " Mar 12 12:49:33.021642 master-0 kubenswrapper[13984]: I0312 12:49:33.021592 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-config-data\") pod \"966ac026-07e7-4214-9c35-942fcd250899\" (UID: \"966ac026-07e7-4214-9c35-942fcd250899\") " Mar 12 12:49:33.024226 master-0 kubenswrapper[13984]: I0312 12:49:33.024167 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966ac026-07e7-4214-9c35-942fcd250899-kube-api-access-b8n4j" (OuterVolumeSpecName: "kube-api-access-b8n4j") pod "966ac026-07e7-4214-9c35-942fcd250899" (UID: "966ac026-07e7-4214-9c35-942fcd250899"). InnerVolumeSpecName "kube-api-access-b8n4j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:33.024964 master-0 kubenswrapper[13984]: I0312 12:49:33.024900 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-scripts" (OuterVolumeSpecName: "scripts") pod "966ac026-07e7-4214-9c35-942fcd250899" (UID: "966ac026-07e7-4214-9c35-942fcd250899"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:33.049198 master-0 kubenswrapper[13984]: I0312 12:49:33.049112 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "966ac026-07e7-4214-9c35-942fcd250899" (UID: "966ac026-07e7-4214-9c35-942fcd250899"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:33.063266 master-0 kubenswrapper[13984]: I0312 12:49:33.063215 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-config-data" (OuterVolumeSpecName: "config-data") pod "966ac026-07e7-4214-9c35-942fcd250899" (UID: "966ac026-07e7-4214-9c35-942fcd250899"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:33.123951 master-0 kubenswrapper[13984]: I0312 12:49:33.123888 13984 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-scripts\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:33.123951 master-0 kubenswrapper[13984]: I0312 12:49:33.123933 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:33.123951 master-0 kubenswrapper[13984]: I0312 12:49:33.123943 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/966ac026-07e7-4214-9c35-942fcd250899-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:33.123951 master-0 kubenswrapper[13984]: I0312 12:49:33.123953 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8n4j\" (UniqueName: \"kubernetes.io/projected/966ac026-07e7-4214-9c35-942fcd250899-kube-api-access-b8n4j\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:33.539415 master-0 kubenswrapper[13984]: I0312 12:49:33.539343 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-52nsz" event={"ID":"966ac026-07e7-4214-9c35-942fcd250899","Type":"ContainerDied","Data":"9761364bff8e9466d3da760d37ea141e3fd0be8c50b1c732556cc8609796f0de"} Mar 12 12:49:33.539974 master-0 kubenswrapper[13984]: I0312 12:49:33.539445 13984 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9761364bff8e9466d3da760d37ea141e3fd0be8c50b1c732556cc8609796f0de" Mar 12 12:49:33.539974 master-0 kubenswrapper[13984]: I0312 12:49:33.539520 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-52nsz" Mar 12 12:49:33.745824 master-0 kubenswrapper[13984]: I0312 12:49:33.745760 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:33.746059 master-0 kubenswrapper[13984]: I0312 12:49:33.746013 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerName="nova-api-log" containerID="cri-o://9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa" gracePeriod=30 Mar 12 12:49:33.746661 master-0 kubenswrapper[13984]: I0312 12:49:33.746617 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerName="nova-api-api" containerID="cri-o://87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541" gracePeriod=30 Mar 12 12:49:33.769286 master-0 kubenswrapper[13984]: I0312 12:49:33.769230 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:49:33.769531 master-0 kubenswrapper[13984]: I0312 12:49:33.769446 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" containerName="nova-scheduler-scheduler" containerID="cri-o://63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" gracePeriod=30 Mar 12 12:49:33.797369 master-0 kubenswrapper[13984]: I0312 12:49:33.796761 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:49:33.797369 master-0 kubenswrapper[13984]: I0312 12:49:33.796997 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-log" containerID="cri-o://681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4" 
gracePeriod=30 Mar 12 12:49:33.797369 master-0 kubenswrapper[13984]: I0312 12:49:33.797138 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-metadata" containerID="cri-o://1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f" gracePeriod=30 Mar 12 12:49:33.997466 master-0 kubenswrapper[13984]: I0312 12:49:33.997398 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc025fea-4c26-4327-b33c-0f29566b6b50" path="/var/lib/kubelet/pods/cc025fea-4c26-4327-b33c-0f29566b6b50/volumes" Mar 12 12:49:34.434940 master-0 kubenswrapper[13984]: I0312 12:49:34.434883 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:34.555606 master-0 kubenswrapper[13984]: I0312 12:49:34.555561 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djrd8\" (UniqueName: \"kubernetes.io/projected/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-kube-api-access-djrd8\") pod \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " Mar 12 12:49:34.556267 master-0 kubenswrapper[13984]: I0312 12:49:34.555699 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-config-data\") pod \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " Mar 12 12:49:34.556267 master-0 kubenswrapper[13984]: I0312 12:49:34.555810 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-combined-ca-bundle\") pod \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " Mar 12 12:49:34.556267 master-0 
kubenswrapper[13984]: I0312 12:49:34.555934 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-internal-tls-certs\") pod \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " Mar 12 12:49:34.556267 master-0 kubenswrapper[13984]: I0312 12:49:34.555980 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-public-tls-certs\") pod \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " Mar 12 12:49:34.556267 master-0 kubenswrapper[13984]: I0312 12:49:34.556039 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-logs\") pod \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\" (UID: \"fbaa7f55-ae33-43a0-945c-0a6988ea56fa\") " Mar 12 12:49:34.557044 master-0 kubenswrapper[13984]: I0312 12:49:34.557014 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-logs" (OuterVolumeSpecName: "logs") pod "fbaa7f55-ae33-43a0-945c-0a6988ea56fa" (UID: "fbaa7f55-ae33-43a0-945c-0a6988ea56fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:49:34.561292 master-0 kubenswrapper[13984]: I0312 12:49:34.561224 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-kube-api-access-djrd8" (OuterVolumeSpecName: "kube-api-access-djrd8") pod "fbaa7f55-ae33-43a0-945c-0a6988ea56fa" (UID: "fbaa7f55-ae33-43a0-945c-0a6988ea56fa"). InnerVolumeSpecName "kube-api-access-djrd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:34.574906 master-0 kubenswrapper[13984]: I0312 12:49:34.574750 13984 generic.go:334] "Generic (PLEG): container finished" podID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerID="681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4" exitCode=143 Mar 12 12:49:34.574906 master-0 kubenswrapper[13984]: I0312 12:49:34.574837 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04e2132b-24d6-45c5-b5ce-728ac475768a","Type":"ContainerDied","Data":"681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4"} Mar 12 12:49:34.577149 master-0 kubenswrapper[13984]: I0312 12:49:34.577108 13984 generic.go:334] "Generic (PLEG): container finished" podID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerID="87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541" exitCode=0 Mar 12 12:49:34.577149 master-0 kubenswrapper[13984]: I0312 12:49:34.577132 13984 generic.go:334] "Generic (PLEG): container finished" podID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerID="9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa" exitCode=143 Mar 12 12:49:34.577344 master-0 kubenswrapper[13984]: I0312 12:49:34.577179 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbaa7f55-ae33-43a0-945c-0a6988ea56fa","Type":"ContainerDied","Data":"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541"} Mar 12 12:49:34.577344 master-0 kubenswrapper[13984]: I0312 12:49:34.577211 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"fbaa7f55-ae33-43a0-945c-0a6988ea56fa","Type":"ContainerDied","Data":"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa"} Mar 12 12:49:34.577344 master-0 kubenswrapper[13984]: I0312 12:49:34.577245 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"fbaa7f55-ae33-43a0-945c-0a6988ea56fa","Type":"ContainerDied","Data":"3123ff0816b28abdbf68d97e18b7f1bc8abb9378a14c182af51a2046eeb775b6"} Mar 12 12:49:34.577344 master-0 kubenswrapper[13984]: I0312 12:49:34.577266 13984 scope.go:117] "RemoveContainer" containerID="87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541" Mar 12 12:49:34.577560 master-0 kubenswrapper[13984]: I0312 12:49:34.577455 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:34.593129 master-0 kubenswrapper[13984]: I0312 12:49:34.593060 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-config-data" (OuterVolumeSpecName: "config-data") pod "fbaa7f55-ae33-43a0-945c-0a6988ea56fa" (UID: "fbaa7f55-ae33-43a0-945c-0a6988ea56fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:34.605464 master-0 kubenswrapper[13984]: I0312 12:49:34.605412 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fbaa7f55-ae33-43a0-945c-0a6988ea56fa" (UID: "fbaa7f55-ae33-43a0-945c-0a6988ea56fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:34.641019 master-0 kubenswrapper[13984]: I0312 12:49:34.640809 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fbaa7f55-ae33-43a0-945c-0a6988ea56fa" (UID: "fbaa7f55-ae33-43a0-945c-0a6988ea56fa"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:34.642974 master-0 kubenswrapper[13984]: I0312 12:49:34.641523 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fbaa7f55-ae33-43a0-945c-0a6988ea56fa" (UID: "fbaa7f55-ae33-43a0-945c-0a6988ea56fa"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:34.661796 master-0 kubenswrapper[13984]: I0312 12:49:34.659618 13984 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:34.661796 master-0 kubenswrapper[13984]: I0312 12:49:34.659670 13984 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:34.661796 master-0 kubenswrapper[13984]: I0312 12:49:34.659684 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:34.661796 master-0 kubenswrapper[13984]: I0312 12:49:34.659697 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djrd8\" (UniqueName: \"kubernetes.io/projected/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-kube-api-access-djrd8\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:34.661796 master-0 kubenswrapper[13984]: I0312 12:49:34.659709 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:34.661796 master-0 kubenswrapper[13984]: 
I0312 12:49:34.659720 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbaa7f55-ae33-43a0-945c-0a6988ea56fa-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:34.737624 master-0 kubenswrapper[13984]: I0312 12:49:34.737468 13984 scope.go:117] "RemoveContainer" containerID="9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa" Mar 12 12:49:34.766893 master-0 kubenswrapper[13984]: I0312 12:49:34.766715 13984 scope.go:117] "RemoveContainer" containerID="87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541" Mar 12 12:49:34.767995 master-0 kubenswrapper[13984]: E0312 12:49:34.767956 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541\": container with ID starting with 87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541 not found: ID does not exist" containerID="87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541" Mar 12 12:49:34.768082 master-0 kubenswrapper[13984]: I0312 12:49:34.768006 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541"} err="failed to get container status \"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541\": rpc error: code = NotFound desc = could not find container \"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541\": container with ID starting with 87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541 not found: ID does not exist" Mar 12 12:49:34.768082 master-0 kubenswrapper[13984]: I0312 12:49:34.768031 13984 scope.go:117] "RemoveContainer" containerID="9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa" Mar 12 12:49:34.768402 master-0 kubenswrapper[13984]: E0312 12:49:34.768380 13984 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa\": container with ID starting with 9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa not found: ID does not exist" containerID="9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa" Mar 12 12:49:34.768504 master-0 kubenswrapper[13984]: I0312 12:49:34.768404 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa"} err="failed to get container status \"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa\": rpc error: code = NotFound desc = could not find container \"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa\": container with ID starting with 9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa not found: ID does not exist" Mar 12 12:49:34.768504 master-0 kubenswrapper[13984]: I0312 12:49:34.768419 13984 scope.go:117] "RemoveContainer" containerID="87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541" Mar 12 12:49:34.768855 master-0 kubenswrapper[13984]: I0312 12:49:34.768823 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541"} err="failed to get container status \"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541\": rpc error: code = NotFound desc = could not find container \"87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541\": container with ID starting with 87b2f77bb88bd4e9fa48bb1fb552bcc3e4f4c2d94cd3c400c2d103f0cfe9d541 not found: ID does not exist" Mar 12 12:49:34.768948 master-0 kubenswrapper[13984]: I0312 12:49:34.768855 13984 scope.go:117] "RemoveContainer" containerID="9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa" Mar 12 
12:49:34.769180 master-0 kubenswrapper[13984]: I0312 12:49:34.769155 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa"} err="failed to get container status \"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa\": rpc error: code = NotFound desc = could not find container \"9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa\": container with ID starting with 9a096193e8070c6b114d1f5fa7d4e564f2649f80e50996eb47d215149f68fafa not found: ID does not exist" Mar 12 12:49:34.942516 master-0 kubenswrapper[13984]: I0312 12:49:34.942272 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:34.972503 master-0 kubenswrapper[13984]: I0312 12:49:34.969169 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:34.992681 master-0 kubenswrapper[13984]: I0312 12:49:34.992610 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:34.993131 master-0 kubenswrapper[13984]: E0312 12:49:34.993100 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerName="nova-api-api" Mar 12 12:49:34.993131 master-0 kubenswrapper[13984]: I0312 12:49:34.993119 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerName="nova-api-api" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: E0312 12:49:34.993139 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966ac026-07e7-4214-9c35-942fcd250899" containerName="nova-manage" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: I0312 12:49:34.993146 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="966ac026-07e7-4214-9c35-942fcd250899" containerName="nova-manage" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: E0312 12:49:34.993159 13984 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerName="dnsmasq-dns" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: I0312 12:49:34.993166 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerName="dnsmasq-dns" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: E0312 12:49:34.993183 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerName="nova-api-log" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: I0312 12:49:34.993188 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerName="nova-api-log" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: E0312 12:49:34.993199 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc21a66e-84d4-4147-a050-e03b4cfc5ddd" containerName="nova-manage" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: I0312 12:49:34.993204 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc21a66e-84d4-4147-a050-e03b4cfc5ddd" containerName="nova-manage" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: E0312 12:49:34.993230 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerName="init" Mar 12 12:49:34.993268 master-0 kubenswrapper[13984]: I0312 12:49:34.993236 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerName="init" Mar 12 12:49:34.993684 master-0 kubenswrapper[13984]: I0312 12:49:34.993464 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc025fea-4c26-4327-b33c-0f29566b6b50" containerName="dnsmasq-dns" Mar 12 12:49:34.993684 master-0 kubenswrapper[13984]: I0312 12:49:34.993490 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" 
containerName="nova-api-api" Mar 12 12:49:34.993684 master-0 kubenswrapper[13984]: I0312 12:49:34.993511 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" containerName="nova-api-log" Mar 12 12:49:34.993684 master-0 kubenswrapper[13984]: I0312 12:49:34.993527 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="966ac026-07e7-4214-9c35-942fcd250899" containerName="nova-manage" Mar 12 12:49:34.993684 master-0 kubenswrapper[13984]: I0312 12:49:34.993540 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc21a66e-84d4-4147-a050-e03b4cfc5ddd" containerName="nova-manage" Mar 12 12:49:34.994687 master-0 kubenswrapper[13984]: I0312 12:49:34.994660 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:34.997032 master-0 kubenswrapper[13984]: I0312 12:49:34.996982 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 12 12:49:34.997198 master-0 kubenswrapper[13984]: I0312 12:49:34.997152 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 12 12:49:34.999507 master-0 kubenswrapper[13984]: I0312 12:49:34.999003 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 12 12:49:35.006024 master-0 kubenswrapper[13984]: I0312 12:49:35.005963 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:35.078541 master-0 kubenswrapper[13984]: I0312 12:49:35.078421 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.078541 master-0 kubenswrapper[13984]: I0312 
12:49:35.078519 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdqq6\" (UniqueName: \"kubernetes.io/projected/975b92c1-fe1e-48a0-97a5-d9c4a520166a-kube-api-access-cdqq6\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.078755 master-0 kubenswrapper[13984]: I0312 12:49:35.078631 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-config-data\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.078755 master-0 kubenswrapper[13984]: I0312 12:49:35.078722 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.078893 master-0 kubenswrapper[13984]: I0312 12:49:35.078864 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-public-tls-certs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.079021 master-0 kubenswrapper[13984]: I0312 12:49:35.078994 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975b92c1-fe1e-48a0-97a5-d9c4a520166a-logs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.182545 master-0 kubenswrapper[13984]: I0312 12:49:35.182394 13984 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cdqq6\" (UniqueName: \"kubernetes.io/projected/975b92c1-fe1e-48a0-97a5-d9c4a520166a-kube-api-access-cdqq6\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.182545 master-0 kubenswrapper[13984]: I0312 12:49:35.182516 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-config-data\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.182765 master-0 kubenswrapper[13984]: I0312 12:49:35.182617 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.182765 master-0 kubenswrapper[13984]: I0312 12:49:35.182645 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-public-tls-certs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.182765 master-0 kubenswrapper[13984]: I0312 12:49:35.182705 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975b92c1-fe1e-48a0-97a5-d9c4a520166a-logs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.182871 master-0 kubenswrapper[13984]: I0312 12:49:35.182777 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.184522 master-0 kubenswrapper[13984]: I0312 12:49:35.184426 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/975b92c1-fe1e-48a0-97a5-d9c4a520166a-logs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.188337 master-0 kubenswrapper[13984]: I0312 12:49:35.188299 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-public-tls-certs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.189614 master-0 kubenswrapper[13984]: I0312 12:49:35.189561 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.192200 master-0 kubenswrapper[13984]: I0312 12:49:35.192164 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-config-data\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.196200 master-0 kubenswrapper[13984]: I0312 12:49:35.196162 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/975b92c1-fe1e-48a0-97a5-d9c4a520166a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.205295 master-0 kubenswrapper[13984]: I0312 12:49:35.205253 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cdqq6\" (UniqueName: \"kubernetes.io/projected/975b92c1-fe1e-48a0-97a5-d9c4a520166a-kube-api-access-cdqq6\") pod \"nova-api-0\" (UID: \"975b92c1-fe1e-48a0-97a5-d9c4a520166a\") " pod="openstack/nova-api-0" Mar 12 12:49:35.362994 master-0 kubenswrapper[13984]: I0312 12:49:35.362944 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 12 12:49:35.813159 master-0 kubenswrapper[13984]: W0312 12:49:35.813076 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod975b92c1_fe1e_48a0_97a5_d9c4a520166a.slice/crio-bd79b7aa27c1b78da9e241c21cd8424e76ed53c132757200b6a1b362fb80d973 WatchSource:0}: Error finding container bd79b7aa27c1b78da9e241c21cd8424e76ed53c132757200b6a1b362fb80d973: Status 404 returned error can't find the container with id bd79b7aa27c1b78da9e241c21cd8424e76ed53c132757200b6a1b362fb80d973 Mar 12 12:49:35.818093 master-0 kubenswrapper[13984]: I0312 12:49:35.818022 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 12 12:49:35.995142 master-0 kubenswrapper[13984]: I0312 12:49:35.995090 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fbaa7f55-ae33-43a0-945c-0a6988ea56fa" path="/var/lib/kubelet/pods/fbaa7f55-ae33-43a0-945c-0a6988ea56fa/volumes" Mar 12 12:49:36.603854 master-0 kubenswrapper[13984]: I0312 12:49:36.603800 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"975b92c1-fe1e-48a0-97a5-d9c4a520166a","Type":"ContainerStarted","Data":"27a817f189d78fec41e7d7e9d2c461b3731c0bef3dae3db1c663b5c604fa1ad6"} Mar 12 12:49:36.603854 master-0 kubenswrapper[13984]: I0312 12:49:36.603855 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"975b92c1-fe1e-48a0-97a5-d9c4a520166a","Type":"ContainerStarted","Data":"8d099a81a2020adf8e481d0d403464cc31af0b3d9b1d6df988f3c5139a245495"} Mar 12 12:49:36.603854 master-0 kubenswrapper[13984]: I0312 12:49:36.603868 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"975b92c1-fe1e-48a0-97a5-d9c4a520166a","Type":"ContainerStarted","Data":"bd79b7aa27c1b78da9e241c21cd8424e76ed53c132757200b6a1b362fb80d973"} Mar 12 12:49:36.638392 master-0 kubenswrapper[13984]: I0312 12:49:36.638298 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.638277815 podStartE2EDuration="2.638277815s" podCreationTimestamp="2026-03-12 12:49:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:36.624362343 +0000 UTC m=+1508.822377855" watchObservedRunningTime="2026-03-12 12:49:36.638277815 +0000 UTC m=+1508.836293307" Mar 12 12:49:36.771031 master-0 kubenswrapper[13984]: E0312 12:49:36.770966 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 12:49:36.773089 master-0 kubenswrapper[13984]: E0312 12:49:36.772992 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 12:49:36.774362 master-0 kubenswrapper[13984]: E0312 12:49:36.774325 13984 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = 
Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 12 12:49:36.774421 master-0 kubenswrapper[13984]: E0312 12:49:36.774388 13984 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" containerName="nova-scheduler-scheduler" Mar 12 12:49:36.932025 master-0 kubenswrapper[13984]: I0312 12:49:36.931855 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.0.251:8775/\": read tcp 10.128.0.2:42158->10.128.0.251:8775: read: connection reset by peer" Mar 12 12:49:36.932025 master-0 kubenswrapper[13984]: I0312 12:49:36.931934 13984 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.0.251:8775/\": read tcp 10.128.0.2:42172->10.128.0.251:8775: read: connection reset by peer" Mar 12 12:49:37.532651 master-0 kubenswrapper[13984]: I0312 12:49:37.532592 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:49:37.643502 master-0 kubenswrapper[13984]: I0312 12:49:37.643017 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-config-data\") pod \"04e2132b-24d6-45c5-b5ce-728ac475768a\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " Mar 12 12:49:37.643502 master-0 kubenswrapper[13984]: I0312 12:49:37.643152 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-nova-metadata-tls-certs\") pod \"04e2132b-24d6-45c5-b5ce-728ac475768a\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " Mar 12 12:49:37.643502 master-0 kubenswrapper[13984]: I0312 12:49:37.643182 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2gkg\" (UniqueName: \"kubernetes.io/projected/04e2132b-24d6-45c5-b5ce-728ac475768a-kube-api-access-n2gkg\") pod \"04e2132b-24d6-45c5-b5ce-728ac475768a\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " Mar 12 12:49:37.643502 master-0 kubenswrapper[13984]: I0312 12:49:37.643207 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2132b-24d6-45c5-b5ce-728ac475768a-logs\") pod \"04e2132b-24d6-45c5-b5ce-728ac475768a\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " Mar 12 12:49:37.643502 master-0 kubenswrapper[13984]: I0312 12:49:37.643242 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-combined-ca-bundle\") pod \"04e2132b-24d6-45c5-b5ce-728ac475768a\" (UID: \"04e2132b-24d6-45c5-b5ce-728ac475768a\") " Mar 12 12:49:37.656460 master-0 kubenswrapper[13984]: I0312 12:49:37.655314 13984 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04e2132b-24d6-45c5-b5ce-728ac475768a-logs" (OuterVolumeSpecName: "logs") pod "04e2132b-24d6-45c5-b5ce-728ac475768a" (UID: "04e2132b-24d6-45c5-b5ce-728ac475768a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 12 12:49:37.658660 master-0 kubenswrapper[13984]: I0312 12:49:37.658142 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04e2132b-24d6-45c5-b5ce-728ac475768a-kube-api-access-n2gkg" (OuterVolumeSpecName: "kube-api-access-n2gkg") pod "04e2132b-24d6-45c5-b5ce-728ac475768a" (UID: "04e2132b-24d6-45c5-b5ce-728ac475768a"). InnerVolumeSpecName "kube-api-access-n2gkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:37.658660 master-0 kubenswrapper[13984]: I0312 12:49:37.658202 13984 generic.go:334] "Generic (PLEG): container finished" podID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerID="1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f" exitCode=0 Mar 12 12:49:37.658660 master-0 kubenswrapper[13984]: I0312 12:49:37.658254 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:49:37.658660 master-0 kubenswrapper[13984]: I0312 12:49:37.658305 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04e2132b-24d6-45c5-b5ce-728ac475768a","Type":"ContainerDied","Data":"1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f"} Mar 12 12:49:37.658660 master-0 kubenswrapper[13984]: I0312 12:49:37.658334 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"04e2132b-24d6-45c5-b5ce-728ac475768a","Type":"ContainerDied","Data":"5562e65b31e97a9c6eb120c32b4647e8feaaafcc10286c8eecf918c6ab7bedae"} Mar 12 12:49:37.658660 master-0 kubenswrapper[13984]: I0312 12:49:37.658350 13984 scope.go:117] "RemoveContainer" containerID="1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f" Mar 12 12:49:37.683504 master-0 kubenswrapper[13984]: I0312 12:49:37.683377 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04e2132b-24d6-45c5-b5ce-728ac475768a" (UID: "04e2132b-24d6-45c5-b5ce-728ac475768a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:37.722708 master-0 kubenswrapper[13984]: I0312 12:49:37.718712 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "04e2132b-24d6-45c5-b5ce-728ac475768a" (UID: "04e2132b-24d6-45c5-b5ce-728ac475768a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:37.732730 master-0 kubenswrapper[13984]: I0312 12:49:37.732678 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-config-data" (OuterVolumeSpecName: "config-data") pod "04e2132b-24d6-45c5-b5ce-728ac475768a" (UID: "04e2132b-24d6-45c5-b5ce-728ac475768a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:37.745581 master-0 kubenswrapper[13984]: I0312 12:49:37.745521 13984 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:37.745581 master-0 kubenswrapper[13984]: I0312 12:49:37.745565 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n2gkg\" (UniqueName: \"kubernetes.io/projected/04e2132b-24d6-45c5-b5ce-728ac475768a-kube-api-access-n2gkg\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:37.745581 master-0 kubenswrapper[13984]: I0312 12:49:37.745574 13984 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/04e2132b-24d6-45c5-b5ce-728ac475768a-logs\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:37.745581 master-0 kubenswrapper[13984]: I0312 12:49:37.745585 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:37.745581 master-0 kubenswrapper[13984]: I0312 12:49:37.745594 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04e2132b-24d6-45c5-b5ce-728ac475768a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:37.845208 master-0 kubenswrapper[13984]: I0312 
12:49:37.845159 13984 scope.go:117] "RemoveContainer" containerID="681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4" Mar 12 12:49:37.877253 master-0 kubenswrapper[13984]: I0312 12:49:37.877216 13984 scope.go:117] "RemoveContainer" containerID="1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f" Mar 12 12:49:37.877881 master-0 kubenswrapper[13984]: E0312 12:49:37.877844 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f\": container with ID starting with 1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f not found: ID does not exist" containerID="1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f" Mar 12 12:49:37.877940 master-0 kubenswrapper[13984]: I0312 12:49:37.877874 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f"} err="failed to get container status \"1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f\": rpc error: code = NotFound desc = could not find container \"1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f\": container with ID starting with 1631a2ccde6222681a6e39271b9b043cdc45ec5d622621b8e5f74224b8943f5f not found: ID does not exist" Mar 12 12:49:37.877940 master-0 kubenswrapper[13984]: I0312 12:49:37.877898 13984 scope.go:117] "RemoveContainer" containerID="681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4" Mar 12 12:49:37.878376 master-0 kubenswrapper[13984]: E0312 12:49:37.878327 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4\": container with ID starting with 681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4 not found: ID does not 
exist" containerID="681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4" Mar 12 12:49:37.878430 master-0 kubenswrapper[13984]: I0312 12:49:37.878376 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4"} err="failed to get container status \"681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4\": rpc error: code = NotFound desc = could not find container \"681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4\": container with ID starting with 681cfff3bcfe4e05bd0fb757c0674e14e43a5f47065bbe3f8073c0274c29bca4 not found: ID does not exist" Mar 12 12:49:38.012086 master-0 kubenswrapper[13984]: I0312 12:49:38.011982 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:49:38.026538 master-0 kubenswrapper[13984]: I0312 12:49:38.026467 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:49:38.051663 master-0 kubenswrapper[13984]: I0312 12:49:38.051600 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:49:38.054578 master-0 kubenswrapper[13984]: E0312 12:49:38.052178 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-log" Mar 12 12:49:38.054578 master-0 kubenswrapper[13984]: I0312 12:49:38.052198 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-log" Mar 12 12:49:38.054578 master-0 kubenswrapper[13984]: E0312 12:49:38.052242 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-metadata" Mar 12 12:49:38.054578 master-0 kubenswrapper[13984]: I0312 12:49:38.052249 13984 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-metadata" Mar 12 12:49:38.054578 master-0 kubenswrapper[13984]: I0312 12:49:38.052454 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-log" Mar 12 12:49:38.054578 master-0 kubenswrapper[13984]: I0312 12:49:38.052526 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" containerName="nova-metadata-metadata" Mar 12 12:49:38.054578 master-0 kubenswrapper[13984]: I0312 12:49:38.053836 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:49:38.060943 master-0 kubenswrapper[13984]: I0312 12:49:38.060758 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 12 12:49:38.061565 master-0 kubenswrapper[13984]: I0312 12:49:38.060961 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 12 12:49:38.080517 master-0 kubenswrapper[13984]: I0312 12:49:38.080439 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:49:38.155203 master-0 kubenswrapper[13984]: I0312 12:49:38.155123 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxqbg\" (UniqueName: \"kubernetes.io/projected/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-kube-api-access-lxqbg\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.155455 master-0 kubenswrapper[13984]: I0312 12:49:38.155260 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.155455 master-0 kubenswrapper[13984]: I0312 12:49:38.155317 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-logs\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.155571 master-0 kubenswrapper[13984]: I0312 12:49:38.155522 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-config-data\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.155571 master-0 kubenswrapper[13984]: I0312 12:49:38.155552 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.257789 master-0 kubenswrapper[13984]: I0312 12:49:38.257648 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-config-data\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.257789 master-0 kubenswrapper[13984]: I0312 12:49:38.257716 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " 
pod="openstack/nova-metadata-0" Mar 12 12:49:38.258722 master-0 kubenswrapper[13984]: I0312 12:49:38.258685 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxqbg\" (UniqueName: \"kubernetes.io/projected/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-kube-api-access-lxqbg\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.258992 master-0 kubenswrapper[13984]: I0312 12:49:38.258966 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.259039 master-0 kubenswrapper[13984]: I0312 12:49:38.259022 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-logs\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.259823 master-0 kubenswrapper[13984]: I0312 12:49:38.259765 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-logs\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.262447 master-0 kubenswrapper[13984]: I0312 12:49:38.262351 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-config-data\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.263521 master-0 kubenswrapper[13984]: I0312 12:49:38.262795 13984 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.263737 master-0 kubenswrapper[13984]: I0312 12:49:38.263696 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.275916 master-0 kubenswrapper[13984]: I0312 12:49:38.275852 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxqbg\" (UniqueName: \"kubernetes.io/projected/954b0e9a-51a8-42d4-b768-c3ac8407ac2f-kube-api-access-lxqbg\") pod \"nova-metadata-0\" (UID: \"954b0e9a-51a8-42d4-b768-c3ac8407ac2f\") " pod="openstack/nova-metadata-0" Mar 12 12:49:38.375355 master-0 kubenswrapper[13984]: I0312 12:49:38.375259 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 12 12:49:38.857677 master-0 kubenswrapper[13984]: I0312 12:49:38.857612 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 12 12:49:39.690760 master-0 kubenswrapper[13984]: I0312 12:49:39.690708 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"954b0e9a-51a8-42d4-b768-c3ac8407ac2f","Type":"ContainerStarted","Data":"6c45f0ab6764f995e4047b6bfd87e9ec37c2e1e62501684c4c18732deadc569f"} Mar 12 12:49:39.690760 master-0 kubenswrapper[13984]: I0312 12:49:39.690764 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"954b0e9a-51a8-42d4-b768-c3ac8407ac2f","Type":"ContainerStarted","Data":"05556975250f0c36cf8f75190e32085072b70013247025814d87cc6c35160b7c"} Mar 12 12:49:39.691427 master-0 kubenswrapper[13984]: I0312 12:49:39.690780 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"954b0e9a-51a8-42d4-b768-c3ac8407ac2f","Type":"ContainerStarted","Data":"d5c61df3c0608f87225e4e1ba404117b9bb1c9ae159aef9ac7ea01d5eaeaa232"} Mar 12 12:49:39.726110 master-0 kubenswrapper[13984]: I0312 12:49:39.726015 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.725994178 podStartE2EDuration="1.725994178s" podCreationTimestamp="2026-03-12 12:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:39.713766553 +0000 UTC m=+1511.911782065" watchObservedRunningTime="2026-03-12 12:49:39.725994178 +0000 UTC m=+1511.924009670" Mar 12 12:49:39.992079 master-0 kubenswrapper[13984]: I0312 12:49:39.992035 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04e2132b-24d6-45c5-b5ce-728ac475768a" 
path="/var/lib/kubelet/pods/04e2132b-24d6-45c5-b5ce-728ac475768a/volumes" Mar 12 12:49:41.419067 master-0 kubenswrapper[13984]: I0312 12:49:41.419032 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 12:49:41.536388 master-0 kubenswrapper[13984]: I0312 12:49:41.536301 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-config-data\") pod \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " Mar 12 12:49:41.536668 master-0 kubenswrapper[13984]: I0312 12:49:41.536500 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdzln\" (UniqueName: \"kubernetes.io/projected/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-kube-api-access-sdzln\") pod \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " Mar 12 12:49:41.536668 master-0 kubenswrapper[13984]: I0312 12:49:41.536587 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-combined-ca-bundle\") pod \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\" (UID: \"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e\") " Mar 12 12:49:41.541111 master-0 kubenswrapper[13984]: I0312 12:49:41.541062 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-kube-api-access-sdzln" (OuterVolumeSpecName: "kube-api-access-sdzln") pod "6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" (UID: "6c0b1ebb-6347-4e46-bf33-e7335cac9c2e"). InnerVolumeSpecName "kube-api-access-sdzln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:49:41.567288 master-0 kubenswrapper[13984]: I0312 12:49:41.567168 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-config-data" (OuterVolumeSpecName: "config-data") pod "6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" (UID: "6c0b1ebb-6347-4e46-bf33-e7335cac9c2e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:41.570349 master-0 kubenswrapper[13984]: I0312 12:49:41.570313 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" (UID: "6c0b1ebb-6347-4e46-bf33-e7335cac9c2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:49:41.642255 master-0 kubenswrapper[13984]: I0312 12:49:41.642179 13984 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:41.642255 master-0 kubenswrapper[13984]: I0312 12:49:41.642239 13984 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-config-data\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:41.642255 master-0 kubenswrapper[13984]: I0312 12:49:41.642253 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdzln\" (UniqueName: \"kubernetes.io/projected/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e-kube-api-access-sdzln\") on node \"master-0\" DevicePath \"\"" Mar 12 12:49:41.734643 master-0 kubenswrapper[13984]: I0312 12:49:41.734587 13984 generic.go:334] "Generic (PLEG): container finished" 
podID="6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" containerID="63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" exitCode=0 Mar 12 12:49:41.734643 master-0 kubenswrapper[13984]: I0312 12:49:41.734640 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e","Type":"ContainerDied","Data":"63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5"} Mar 12 12:49:41.734909 master-0 kubenswrapper[13984]: I0312 12:49:41.734672 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"6c0b1ebb-6347-4e46-bf33-e7335cac9c2e","Type":"ContainerDied","Data":"e4103d2c06521d28a7b327bb86567f40a041181b4dfa60c43c0095d49872b955"} Mar 12 12:49:41.734909 master-0 kubenswrapper[13984]: I0312 12:49:41.734694 13984 scope.go:117] "RemoveContainer" containerID="63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" Mar 12 12:49:41.734909 master-0 kubenswrapper[13984]: I0312 12:49:41.734854 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 12:49:41.812535 master-0 kubenswrapper[13984]: I0312 12:49:41.812299 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:49:41.821573 master-0 kubenswrapper[13984]: I0312 12:49:41.818708 13984 scope.go:117] "RemoveContainer" containerID="63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" Mar 12 12:49:41.825499 master-0 kubenswrapper[13984]: E0312 12:49:41.822055 13984 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5\": container with ID starting with 63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5 not found: ID does not exist" containerID="63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5" Mar 12 12:49:41.825499 master-0 kubenswrapper[13984]: I0312 12:49:41.822113 13984 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5"} err="failed to get container status \"63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5\": rpc error: code = NotFound desc = could not find container \"63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5\": container with ID starting with 63434716ec56d1a90426b7686bca28da9853488d49bfd1a46954229420660ac5 not found: ID does not exist" Mar 12 12:49:41.833497 master-0 kubenswrapper[13984]: I0312 12:49:41.831289 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:49:41.862496 master-0 kubenswrapper[13984]: I0312 12:49:41.861907 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:49:41.862712 master-0 kubenswrapper[13984]: E0312 12:49:41.862597 13984 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" containerName="nova-scheduler-scheduler" Mar 12 12:49:41.862712 master-0 kubenswrapper[13984]: I0312 12:49:41.862617 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" containerName="nova-scheduler-scheduler" Mar 12 12:49:41.866495 master-0 kubenswrapper[13984]: I0312 12:49:41.862927 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" containerName="nova-scheduler-scheduler" Mar 12 12:49:41.866495 master-0 kubenswrapper[13984]: I0312 12:49:41.863988 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 12:49:41.874933 master-0 kubenswrapper[13984]: I0312 12:49:41.869382 13984 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 12 12:49:41.901519 master-0 kubenswrapper[13984]: I0312 12:49:41.899614 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:49:41.956535 master-0 kubenswrapper[13984]: I0312 12:49:41.953927 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e9fc6-888e-41b0-b034-cf71c4f47610-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:41.956535 master-0 kubenswrapper[13984]: I0312 12:49:41.953994 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s9lt\" (UniqueName: \"kubernetes.io/projected/be5e9fc6-888e-41b0-b034-cf71c4f47610-kube-api-access-6s9lt\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:41.956535 master-0 kubenswrapper[13984]: I0312 12:49:41.954224 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e9fc6-888e-41b0-b034-cf71c4f47610-config-data\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:42.000094 master-0 kubenswrapper[13984]: I0312 12:49:42.000035 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c0b1ebb-6347-4e46-bf33-e7335cac9c2e" path="/var/lib/kubelet/pods/6c0b1ebb-6347-4e46-bf33-e7335cac9c2e/volumes" Mar 12 12:49:42.056144 master-0 kubenswrapper[13984]: I0312 12:49:42.056059 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e9fc6-888e-41b0-b034-cf71c4f47610-config-data\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:42.056369 master-0 kubenswrapper[13984]: I0312 12:49:42.056230 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e9fc6-888e-41b0-b034-cf71c4f47610-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:42.056369 master-0 kubenswrapper[13984]: I0312 12:49:42.056266 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s9lt\" (UniqueName: \"kubernetes.io/projected/be5e9fc6-888e-41b0-b034-cf71c4f47610-kube-api-access-6s9lt\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:42.063407 master-0 kubenswrapper[13984]: I0312 12:49:42.063127 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/be5e9fc6-888e-41b0-b034-cf71c4f47610-config-data\") pod \"nova-scheduler-0\" (UID: 
\"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:42.063407 master-0 kubenswrapper[13984]: I0312 12:49:42.063174 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be5e9fc6-888e-41b0-b034-cf71c4f47610-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:42.074154 master-0 kubenswrapper[13984]: I0312 12:49:42.074059 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s9lt\" (UniqueName: \"kubernetes.io/projected/be5e9fc6-888e-41b0-b034-cf71c4f47610-kube-api-access-6s9lt\") pod \"nova-scheduler-0\" (UID: \"be5e9fc6-888e-41b0-b034-cf71c4f47610\") " pod="openstack/nova-scheduler-0" Mar 12 12:49:42.180394 master-0 kubenswrapper[13984]: I0312 12:49:42.180352 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 12 12:49:42.642986 master-0 kubenswrapper[13984]: I0312 12:49:42.637366 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 12 12:49:42.645541 master-0 kubenswrapper[13984]: W0312 12:49:42.644789 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5e9fc6_888e_41b0_b034_cf71c4f47610.slice/crio-609080012d729a23a5c8c458c62c981a7a473e31182b5d40c3ab546481b5322b WatchSource:0}: Error finding container 609080012d729a23a5c8c458c62c981a7a473e31182b5d40c3ab546481b5322b: Status 404 returned error can't find the container with id 609080012d729a23a5c8c458c62c981a7a473e31182b5d40c3ab546481b5322b Mar 12 12:49:42.752374 master-0 kubenswrapper[13984]: I0312 12:49:42.752292 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"be5e9fc6-888e-41b0-b034-cf71c4f47610","Type":"ContainerStarted","Data":"609080012d729a23a5c8c458c62c981a7a473e31182b5d40c3ab546481b5322b"} Mar 12 12:49:43.376523 master-0 kubenswrapper[13984]: I0312 12:49:43.376385 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 12:49:43.376746 master-0 kubenswrapper[13984]: I0312 12:49:43.376535 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 12 12:49:43.771444 master-0 kubenswrapper[13984]: I0312 12:49:43.771387 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"be5e9fc6-888e-41b0-b034-cf71c4f47610","Type":"ContainerStarted","Data":"ff0909503eca73bed45a956a535c5207ead1d349347c79a2546be453130827d5"} Mar 12 12:49:43.795586 master-0 kubenswrapper[13984]: I0312 12:49:43.795502 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.795464393 podStartE2EDuration="2.795464393s" podCreationTimestamp="2026-03-12 12:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:49:43.789062107 +0000 UTC m=+1515.987077609" watchObservedRunningTime="2026-03-12 12:49:43.795464393 +0000 UTC m=+1515.993479885" Mar 12 12:49:45.364266 master-0 kubenswrapper[13984]: I0312 12:49:45.364169 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 12:49:45.364266 master-0 kubenswrapper[13984]: I0312 12:49:45.364263 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 12 12:49:46.378713 master-0 kubenswrapper[13984]: I0312 12:49:46.378639 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="975b92c1-fe1e-48a0-97a5-d9c4a520166a" containerName="nova-api-api" 
probeResult="failure" output="Get \"https://10.128.1.4:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:49:46.378713 master-0 kubenswrapper[13984]: I0312 12:49:46.378674 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="975b92c1-fe1e-48a0-97a5-d9c4a520166a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.4:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:49:47.181228 master-0 kubenswrapper[13984]: I0312 12:49:47.181169 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 12 12:49:48.376635 master-0 kubenswrapper[13984]: I0312 12:49:48.376524 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 12:49:48.376635 master-0 kubenswrapper[13984]: I0312 12:49:48.376594 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 12 12:49:49.388725 master-0 kubenswrapper[13984]: I0312 12:49:49.388665 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="954b0e9a-51a8-42d4-b768-c3ac8407ac2f" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.5:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:49:49.389436 master-0 kubenswrapper[13984]: I0312 12:49:49.388676 13984 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="954b0e9a-51a8-42d4-b768-c3ac8407ac2f" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.5:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 12 12:49:52.180702 master-0 kubenswrapper[13984]: I0312 12:49:52.180649 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-scheduler-0" Mar 12 12:49:52.209136 master-0 kubenswrapper[13984]: I0312 12:49:52.209076 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 12 12:49:52.927763 master-0 kubenswrapper[13984]: I0312 12:49:52.927546 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 12 12:49:55.370096 master-0 kubenswrapper[13984]: I0312 12:49:55.370030 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 12:49:55.371803 master-0 kubenswrapper[13984]: I0312 12:49:55.371764 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 12:49:55.374017 master-0 kubenswrapper[13984]: I0312 12:49:55.373958 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 12 12:49:55.378156 master-0 kubenswrapper[13984]: I0312 12:49:55.378117 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 12:49:55.935290 master-0 kubenswrapper[13984]: I0312 12:49:55.935223 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 12 12:49:55.952246 master-0 kubenswrapper[13984]: I0312 12:49:55.952159 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 12 12:49:58.381755 master-0 kubenswrapper[13984]: I0312 12:49:58.381649 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 12:49:58.384619 master-0 kubenswrapper[13984]: I0312 12:49:58.384570 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 12 12:49:58.388905 master-0 kubenswrapper[13984]: I0312 12:49:58.388840 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Mar 12 12:49:58.983181 master-0 kubenswrapper[13984]: I0312 12:49:58.983110 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 12 12:50:25.882263 master-0 kubenswrapper[13984]: I0312 12:50:25.881799 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-p6ggm"] Mar 12 12:50:25.882263 master-0 kubenswrapper[13984]: I0312 12:50:25.882062 13984 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" podUID="09acc31e-703b-4124-a43d-ee8d2ff94d96" containerName="sushy-emulator" containerID="cri-o://c70de9c4e88d3c97ad7ad479acb9a26b173cf5b25c3c9f3a396e702a946cd740" gracePeriod=30 Mar 12 12:50:26.334459 master-0 kubenswrapper[13984]: I0312 12:50:26.334377 13984 generic.go:334] "Generic (PLEG): container finished" podID="09acc31e-703b-4124-a43d-ee8d2ff94d96" containerID="c70de9c4e88d3c97ad7ad479acb9a26b173cf5b25c3c9f3a396e702a946cd740" exitCode=0 Mar 12 12:50:26.334459 master-0 kubenswrapper[13984]: I0312 12:50:26.334417 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" event={"ID":"09acc31e-703b-4124-a43d-ee8d2ff94d96","Type":"ContainerDied","Data":"c70de9c4e88d3c97ad7ad479acb9a26b173cf5b25c3c9f3a396e702a946cd740"} Mar 12 12:50:26.757562 master-0 kubenswrapper[13984]: I0312 12:50:26.757511 13984 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:50:26.895282 master-0 kubenswrapper[13984]: I0312 12:50:26.895128 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-sqlnt"] Mar 12 12:50:26.895842 master-0 kubenswrapper[13984]: E0312 12:50:26.895724 13984 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09acc31e-703b-4124-a43d-ee8d2ff94d96" containerName="sushy-emulator" Mar 12 12:50:26.895842 master-0 kubenswrapper[13984]: I0312 12:50:26.895739 13984 state_mem.go:107] "Deleted CPUSet assignment" podUID="09acc31e-703b-4124-a43d-ee8d2ff94d96" containerName="sushy-emulator" Mar 12 12:50:26.896077 master-0 kubenswrapper[13984]: I0312 12:50:26.896048 13984 memory_manager.go:354] "RemoveStaleState removing state" podUID="09acc31e-703b-4124-a43d-ee8d2ff94d96" containerName="sushy-emulator" Mar 12 12:50:26.896936 master-0 kubenswrapper[13984]: I0312 12:50:26.896904 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:26.942066 master-0 kubenswrapper[13984]: I0312 12:50:26.912158 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-sqlnt"] Mar 12 12:50:26.945288 master-0 kubenswrapper[13984]: I0312 12:50:26.945243 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/09acc31e-703b-4124-a43d-ee8d2ff94d96-os-client-config\") pod \"09acc31e-703b-4124-a43d-ee8d2ff94d96\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " Mar 12 12:50:26.949083 master-0 kubenswrapper[13984]: I0312 12:50:26.948953 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7p8qh\" (UniqueName: \"kubernetes.io/projected/09acc31e-703b-4124-a43d-ee8d2ff94d96-kube-api-access-7p8qh\") pod \"09acc31e-703b-4124-a43d-ee8d2ff94d96\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " Mar 12 12:50:26.949200 master-0 kubenswrapper[13984]: I0312 12:50:26.949083 13984 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/09acc31e-703b-4124-a43d-ee8d2ff94d96-sushy-emulator-config\") pod \"09acc31e-703b-4124-a43d-ee8d2ff94d96\" (UID: \"09acc31e-703b-4124-a43d-ee8d2ff94d96\") " Mar 12 12:50:26.949741 master-0 kubenswrapper[13984]: I0312 12:50:26.949700 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85xnx\" (UniqueName: \"kubernetes.io/projected/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-kube-api-access-85xnx\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:26.950028 master-0 kubenswrapper[13984]: I0312 12:50:26.949993 13984 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-os-client-config\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:26.950543 master-0 kubenswrapper[13984]: I0312 12:50:26.950506 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:26.951751 master-0 kubenswrapper[13984]: I0312 12:50:26.951716 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09acc31e-703b-4124-a43d-ee8d2ff94d96-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "09acc31e-703b-4124-a43d-ee8d2ff94d96" (UID: "09acc31e-703b-4124-a43d-ee8d2ff94d96"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 12 12:50:26.953351 master-0 kubenswrapper[13984]: I0312 12:50:26.953313 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09acc31e-703b-4124-a43d-ee8d2ff94d96-kube-api-access-7p8qh" (OuterVolumeSpecName: "kube-api-access-7p8qh") pod "09acc31e-703b-4124-a43d-ee8d2ff94d96" (UID: "09acc31e-703b-4124-a43d-ee8d2ff94d96"). InnerVolumeSpecName "kube-api-access-7p8qh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 12 12:50:26.953608 master-0 kubenswrapper[13984]: I0312 12:50:26.953534 13984 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09acc31e-703b-4124-a43d-ee8d2ff94d96-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "09acc31e-703b-4124-a43d-ee8d2ff94d96" (UID: "09acc31e-703b-4124-a43d-ee8d2ff94d96"). InnerVolumeSpecName "sushy-emulator-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 12 12:50:27.053040 master-0 kubenswrapper[13984]: I0312 12:50:27.052998 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:27.053375 master-0 kubenswrapper[13984]: I0312 12:50:27.053351 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85xnx\" (UniqueName: \"kubernetes.io/projected/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-kube-api-access-85xnx\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:27.053677 master-0 kubenswrapper[13984]: I0312 12:50:27.053656 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-os-client-config\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:27.054313 master-0 kubenswrapper[13984]: I0312 12:50:27.054268 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: 
\"kubernetes.io/configmap/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:27.054313 master-0 kubenswrapper[13984]: I0312 12:50:27.054291 13984 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/09acc31e-703b-4124-a43d-ee8d2ff94d96-os-client-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:50:27.054457 master-0 kubenswrapper[13984]: I0312 12:50:27.054345 13984 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7p8qh\" (UniqueName: \"kubernetes.io/projected/09acc31e-703b-4124-a43d-ee8d2ff94d96-kube-api-access-7p8qh\") on node \"master-0\" DevicePath \"\"" Mar 12 12:50:27.054457 master-0 kubenswrapper[13984]: I0312 12:50:27.054365 13984 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/09acc31e-703b-4124-a43d-ee8d2ff94d96-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Mar 12 12:50:27.057505 master-0 kubenswrapper[13984]: I0312 12:50:27.057457 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-os-client-config\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:27.069011 master-0 kubenswrapper[13984]: I0312 12:50:27.068980 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85xnx\" (UniqueName: \"kubernetes.io/projected/76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe-kube-api-access-85xnx\") pod \"sushy-emulator-84965d5d88-sqlnt\" (UID: \"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe\") " pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:27.343886 master-0 
kubenswrapper[13984]: I0312 12:50:27.343820 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:27.346558 master-0 kubenswrapper[13984]: I0312 12:50:27.346515 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" event={"ID":"09acc31e-703b-4124-a43d-ee8d2ff94d96","Type":"ContainerDied","Data":"29ca8b52a193b372234a4b1e4080ac30bd76569ed3c77f5736a667a923ef5d6b"} Mar 12 12:50:27.346651 master-0 kubenswrapper[13984]: I0312 12:50:27.346567 13984 scope.go:117] "RemoveContainer" containerID="c70de9c4e88d3c97ad7ad479acb9a26b173cf5b25c3c9f3a396e702a946cd740" Mar 12 12:50:27.346651 master-0 kubenswrapper[13984]: I0312 12:50:27.346633 13984 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-p6ggm" Mar 12 12:50:27.540872 master-0 kubenswrapper[13984]: I0312 12:50:27.540787 13984 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-p6ggm"] Mar 12 12:50:27.560777 master-0 kubenswrapper[13984]: I0312 12:50:27.560648 13984 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-p6ggm"] Mar 12 12:50:27.935836 master-0 kubenswrapper[13984]: I0312 12:50:27.935741 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-sqlnt"] Mar 12 12:50:27.997941 master-0 kubenswrapper[13984]: I0312 12:50:27.995204 13984 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09acc31e-703b-4124-a43d-ee8d2ff94d96" path="/var/lib/kubelet/pods/09acc31e-703b-4124-a43d-ee8d2ff94d96/volumes" Mar 12 12:50:28.363258 master-0 kubenswrapper[13984]: I0312 12:50:28.363195 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" 
event={"ID":"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe","Type":"ContainerStarted","Data":"c6b6cac207b5d3c9d463984c33432ff00de7efcb1f1336884b167a1cff3f284f"} Mar 12 12:50:28.363258 master-0 kubenswrapper[13984]: I0312 12:50:28.363242 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" event={"ID":"76b1de9d-2aa5-4c73-bae5-c133b3aa3fbe","Type":"ContainerStarted","Data":"347c1c45d3c96f53a71ee202c2808ed7a0c9b3f5eb5c37cf63fe0e8f38eb2713"} Mar 12 12:50:28.401387 master-0 kubenswrapper[13984]: I0312 12:50:28.400600 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" podStartSLOduration=2.40056561 podStartE2EDuration="2.40056561s" podCreationTimestamp="2026-03-12 12:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:50:28.383633696 +0000 UTC m=+1560.581649198" watchObservedRunningTime="2026-03-12 12:50:28.40056561 +0000 UTC m=+1560.598581142" Mar 12 12:50:37.345635 master-0 kubenswrapper[13984]: I0312 12:50:37.345558 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:37.346287 master-0 kubenswrapper[13984]: I0312 12:50:37.345656 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:37.358415 master-0 kubenswrapper[13984]: I0312 12:50:37.358335 13984 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:50:37.477435 master-0 kubenswrapper[13984]: I0312 12:50:37.477390 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-84965d5d88-sqlnt" Mar 12 12:51:42.346464 master-0 kubenswrapper[13984]: I0312 12:51:42.346400 13984 scope.go:117] 
"RemoveContainer" containerID="97baea7771b388f78ed90c0c95190d2450394fc3f19b93d18c4699807735a2fe" Mar 12 12:52:42.404173 master-0 kubenswrapper[13984]: I0312 12:52:42.404084 13984 scope.go:117] "RemoveContainer" containerID="2cd511b3f5082cee0c8dab442e8c47fbf12993f38966dcf97f415e18b360a966" Mar 12 12:52:42.453502 master-0 kubenswrapper[13984]: I0312 12:52:42.453418 13984 scope.go:117] "RemoveContainer" containerID="7b13f30cb13c59942bbe65a5bd4ac18816f0e5964d563ec4664fe36103c129af" Mar 12 12:54:32.824431 master-0 kubenswrapper[13984]: I0312 12:54:32.824373 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kbz5s/must-gather-7dmgs"] Mar 12 12:54:32.829815 master-0 kubenswrapper[13984]: I0312 12:54:32.826397 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:32.832690 master-0 kubenswrapper[13984]: I0312 12:54:32.832637 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kbz5s"/"openshift-service-ca.crt" Mar 12 12:54:32.832779 master-0 kubenswrapper[13984]: I0312 12:54:32.832644 13984 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-kbz5s"/"kube-root-ca.crt" Mar 12 12:54:32.837668 master-0 kubenswrapper[13984]: I0312 12:54:32.837601 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kbz5s/must-gather-6zs9x"] Mar 12 12:54:32.842060 master-0 kubenswrapper[13984]: I0312 12:54:32.842006 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:32.853928 master-0 kubenswrapper[13984]: I0312 12:54:32.853864 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kbz5s/must-gather-7dmgs"] Mar 12 12:54:32.867628 master-0 kubenswrapper[13984]: I0312 12:54:32.866889 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kbz5s/must-gather-6zs9x"] Mar 12 12:54:32.932579 master-0 kubenswrapper[13984]: I0312 12:54:32.931151 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a98af41-d790-4832-a89f-8cb27c7ce32a-must-gather-output\") pod \"must-gather-7dmgs\" (UID: \"0a98af41-d790-4832-a89f-8cb27c7ce32a\") " pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:32.932579 master-0 kubenswrapper[13984]: I0312 12:54:32.931267 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbspn\" (UniqueName: \"kubernetes.io/projected/0a98af41-d790-4832-a89f-8cb27c7ce32a-kube-api-access-jbspn\") pod \"must-gather-7dmgs\" (UID: \"0a98af41-d790-4832-a89f-8cb27c7ce32a\") " pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:32.932579 master-0 kubenswrapper[13984]: I0312 12:54:32.931330 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdjkr\" (UniqueName: \"kubernetes.io/projected/8a395679-a171-4620-83fd-5c52e499bfb1-kube-api-access-qdjkr\") pod \"must-gather-6zs9x\" (UID: \"8a395679-a171-4620-83fd-5c52e499bfb1\") " pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:32.932579 master-0 kubenswrapper[13984]: I0312 12:54:32.931427 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/8a395679-a171-4620-83fd-5c52e499bfb1-must-gather-output\") pod \"must-gather-6zs9x\" (UID: \"8a395679-a171-4620-83fd-5c52e499bfb1\") " pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:33.034581 master-0 kubenswrapper[13984]: I0312 12:54:33.033336 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8a395679-a171-4620-83fd-5c52e499bfb1-must-gather-output\") pod \"must-gather-6zs9x\" (UID: \"8a395679-a171-4620-83fd-5c52e499bfb1\") " pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:33.034581 master-0 kubenswrapper[13984]: I0312 12:54:33.033428 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a98af41-d790-4832-a89f-8cb27c7ce32a-must-gather-output\") pod \"must-gather-7dmgs\" (UID: \"0a98af41-d790-4832-a89f-8cb27c7ce32a\") " pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:33.034581 master-0 kubenswrapper[13984]: I0312 12:54:33.033557 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbspn\" (UniqueName: \"kubernetes.io/projected/0a98af41-d790-4832-a89f-8cb27c7ce32a-kube-api-access-jbspn\") pod \"must-gather-7dmgs\" (UID: \"0a98af41-d790-4832-a89f-8cb27c7ce32a\") " pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:33.034581 master-0 kubenswrapper[13984]: I0312 12:54:33.033633 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdjkr\" (UniqueName: \"kubernetes.io/projected/8a395679-a171-4620-83fd-5c52e499bfb1-kube-api-access-qdjkr\") pod \"must-gather-6zs9x\" (UID: \"8a395679-a171-4620-83fd-5c52e499bfb1\") " pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:33.034961 master-0 kubenswrapper[13984]: I0312 12:54:33.034780 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a98af41-d790-4832-a89f-8cb27c7ce32a-must-gather-output\") pod \"must-gather-7dmgs\" (UID: \"0a98af41-d790-4832-a89f-8cb27c7ce32a\") " pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:33.036387 master-0 kubenswrapper[13984]: I0312 12:54:33.035664 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8a395679-a171-4620-83fd-5c52e499bfb1-must-gather-output\") pod \"must-gather-6zs9x\" (UID: \"8a395679-a171-4620-83fd-5c52e499bfb1\") " pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:33.061333 master-0 kubenswrapper[13984]: I0312 12:54:33.060279 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdjkr\" (UniqueName: \"kubernetes.io/projected/8a395679-a171-4620-83fd-5c52e499bfb1-kube-api-access-qdjkr\") pod \"must-gather-6zs9x\" (UID: \"8a395679-a171-4620-83fd-5c52e499bfb1\") " pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:33.065462 master-0 kubenswrapper[13984]: I0312 12:54:33.064462 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbspn\" (UniqueName: \"kubernetes.io/projected/0a98af41-d790-4832-a89f-8cb27c7ce32a-kube-api-access-jbspn\") pod \"must-gather-7dmgs\" (UID: \"0a98af41-d790-4832-a89f-8cb27c7ce32a\") " pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:33.144190 master-0 kubenswrapper[13984]: I0312 12:54:33.144046 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kbz5s/must-gather-7dmgs" Mar 12 12:54:33.162459 master-0 kubenswrapper[13984]: I0312 12:54:33.162392 13984 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kbz5s/must-gather-6zs9x" Mar 12 12:54:33.753587 master-0 kubenswrapper[13984]: I0312 12:54:33.753523 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kbz5s/must-gather-6zs9x"] Mar 12 12:54:33.759157 master-0 kubenswrapper[13984]: W0312 12:54:33.759016 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a395679_a171_4620_83fd_5c52e499bfb1.slice/crio-88df0b6e8dbfbb156a61b1440a65f6b970519c89ea54b7058b1bbaba41024981 WatchSource:0}: Error finding container 88df0b6e8dbfbb156a61b1440a65f6b970519c89ea54b7058b1bbaba41024981: Status 404 returned error can't find the container with id 88df0b6e8dbfbb156a61b1440a65f6b970519c89ea54b7058b1bbaba41024981 Mar 12 12:54:33.768685 master-0 kubenswrapper[13984]: I0312 12:54:33.768233 13984 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 12 12:54:33.889503 master-0 kubenswrapper[13984]: I0312 12:54:33.889276 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kbz5s/must-gather-7dmgs"] Mar 12 12:54:33.893976 master-0 kubenswrapper[13984]: W0312 12:54:33.893928 13984 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a98af41_d790_4832_a89f_8cb27c7ce32a.slice/crio-95cf715887ef7946b220033e21a2f02870d0c8fbc27b34655d7d3a2664aa7b34 WatchSource:0}: Error finding container 95cf715887ef7946b220033e21a2f02870d0c8fbc27b34655d7d3a2664aa7b34: Status 404 returned error can't find the container with id 95cf715887ef7946b220033e21a2f02870d0c8fbc27b34655d7d3a2664aa7b34 Mar 12 12:54:34.544909 master-0 kubenswrapper[13984]: I0312 12:54:34.544846 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/must-gather-6zs9x" 
event={"ID":"8a395679-a171-4620-83fd-5c52e499bfb1","Type":"ContainerStarted","Data":"88df0b6e8dbfbb156a61b1440a65f6b970519c89ea54b7058b1bbaba41024981"} Mar 12 12:54:34.548098 master-0 kubenswrapper[13984]: I0312 12:54:34.547308 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/must-gather-7dmgs" event={"ID":"0a98af41-d790-4832-a89f-8cb27c7ce32a","Type":"ContainerStarted","Data":"95cf715887ef7946b220033e21a2f02870d0c8fbc27b34655d7d3a2664aa7b34"} Mar 12 12:54:35.565832 master-0 kubenswrapper[13984]: I0312 12:54:35.565747 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/must-gather-7dmgs" event={"ID":"0a98af41-d790-4832-a89f-8cb27c7ce32a","Type":"ContainerStarted","Data":"e6ca9b6cb7438ec8f889fc737cbc64fd1dded5f0ef78a95cdde61f2b774ba540"} Mar 12 12:54:36.579840 master-0 kubenswrapper[13984]: I0312 12:54:36.579782 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/must-gather-7dmgs" event={"ID":"0a98af41-d790-4832-a89f-8cb27c7ce32a","Type":"ContainerStarted","Data":"ca58db710aadc1076a24b22c59ebcb6d521448784d28a75ddcf93dca2c20b082"} Mar 12 12:54:36.649692 master-0 kubenswrapper[13984]: I0312 12:54:36.649615 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kbz5s/must-gather-7dmgs" podStartSLOduration=3.514133477 podStartE2EDuration="4.647990847s" podCreationTimestamp="2026-03-12 12:54:32 +0000 UTC" firstStartedPulling="2026-03-12 12:54:33.89666833 +0000 UTC m=+1806.094683822" lastFinishedPulling="2026-03-12 12:54:35.0305257 +0000 UTC m=+1807.228541192" observedRunningTime="2026-03-12 12:54:36.619367772 +0000 UTC m=+1808.817383324" watchObservedRunningTime="2026-03-12 12:54:36.647990847 +0000 UTC m=+1808.846006339" Mar 12 12:54:38.037365 master-0 kubenswrapper[13984]: I0312 12:54:38.037302 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-7dg9w_114b1d16-b37d-449c-84e3-3fb3f8b20eaa/cluster-version-operator/1.log" Mar 12 12:54:38.201280 master-0 kubenswrapper[13984]: I0312 12:54:38.201214 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-7dg9w_114b1d16-b37d-449c-84e3-3fb3f8b20eaa/cluster-version-operator/0.log" Mar 12 12:54:42.714709 master-0 kubenswrapper[13984]: I0312 12:54:42.714660 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/must-gather-6zs9x" event={"ID":"8a395679-a171-4620-83fd-5c52e499bfb1","Type":"ContainerStarted","Data":"97f40d929414feff9c6b9616bc345e5f3a6333bac3c2bfbf5566eb85bcc17dbe"} Mar 12 12:54:42.715275 master-0 kubenswrapper[13984]: I0312 12:54:42.715257 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/must-gather-6zs9x" event={"ID":"8a395679-a171-4620-83fd-5c52e499bfb1","Type":"ContainerStarted","Data":"035b3267bab040912427936f58c566d9964a69d896318de9b349c89655a2351a"} Mar 12 12:54:42.806580 master-0 kubenswrapper[13984]: I0312 12:54:42.805782 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kbz5s/must-gather-6zs9x" podStartSLOduration=3.256569126 podStartE2EDuration="10.805764776s" podCreationTimestamp="2026-03-12 12:54:32 +0000 UTC" firstStartedPulling="2026-03-12 12:54:33.768185403 +0000 UTC m=+1805.966200895" lastFinishedPulling="2026-03-12 12:54:41.317381043 +0000 UTC m=+1813.515396545" observedRunningTime="2026-03-12 12:54:42.78429462 +0000 UTC m=+1814.982310112" watchObservedRunningTime="2026-03-12 12:54:42.805764776 +0000 UTC m=+1815.003780258" Mar 12 12:54:43.184052 master-0 kubenswrapper[13984]: I0312 12:54:43.183959 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-9n27b_2a98b253-d5ac-4e39-ac14-d24c5dad3ec2/nmstate-console-plugin/0.log" Mar 12 12:54:43.214520 master-0 kubenswrapper[13984]: I0312 12:54:43.214057 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-tpsct_16b99c26-f5eb-4c89-ac3c-e08d2a3a9638/nmstate-handler/0.log" Mar 12 12:54:43.230680 master-0 kubenswrapper[13984]: I0312 12:54:43.230633 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-hmdgg_8e356084-3bba-4665-b579-fb8e0a999783/nmstate-metrics/0.log" Mar 12 12:54:43.267981 master-0 kubenswrapper[13984]: I0312 12:54:43.265947 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-hmdgg_8e356084-3bba-4665-b579-fb8e0a999783/kube-rbac-proxy/0.log" Mar 12 12:54:43.444517 master-0 kubenswrapper[13984]: I0312 12:54:43.443336 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-l6mg5_2c044b22-cc0c-4ec1-b0f6-a6f34016dfbf/nmstate-operator/0.log" Mar 12 12:54:43.791565 master-0 kubenswrapper[13984]: I0312 12:54:43.790280 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-zhp64_a379bead-8afc-458e-90cf-5859fdd27e8c/nmstate-webhook/0.log" Mar 12 12:54:45.148525 master-0 kubenswrapper[13984]: I0312 12:54:45.148019 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bvl5s_57f59392-f9b6-4b99-ad37-33c5b7c04adb/controller/0.log" Mar 12 12:54:45.163740 master-0 kubenswrapper[13984]: I0312 12:54:45.160693 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bvl5s_57f59392-f9b6-4b99-ad37-33c5b7c04adb/kube-rbac-proxy/0.log" Mar 12 12:54:45.194105 master-0 kubenswrapper[13984]: I0312 12:54:45.193019 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/controller/0.log" Mar 12 12:54:46.223983 master-0 kubenswrapper[13984]: I0312 12:54:46.220333 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/frr/0.log" Mar 12 12:54:46.235816 master-0 kubenswrapper[13984]: I0312 12:54:46.233251 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/reloader/0.log" Mar 12 12:54:46.243508 master-0 kubenswrapper[13984]: I0312 12:54:46.241939 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/frr-metrics/0.log" Mar 12 12:54:46.252530 master-0 kubenswrapper[13984]: I0312 12:54:46.250711 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/kube-rbac-proxy/0.log" Mar 12 12:54:46.262520 master-0 kubenswrapper[13984]: I0312 12:54:46.260988 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/kube-rbac-proxy-frr/0.log" Mar 12 12:54:46.279872 master-0 kubenswrapper[13984]: I0312 12:54:46.279078 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/cp-frr-files/0.log" Mar 12 12:54:46.292041 master-0 kubenswrapper[13984]: I0312 12:54:46.290583 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/cp-reloader/0.log" Mar 12 12:54:46.299237 master-0 kubenswrapper[13984]: I0312 12:54:46.298547 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/cp-metrics/0.log" Mar 12 12:54:46.319694 master-0 kubenswrapper[13984]: I0312 12:54:46.319652 13984 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-c6n7v_2ccd308f-4d04-477a-9671-ced98cf499c1/frr-k8s-webhook-server/0.log" Mar 12 12:54:46.355632 master-0 kubenswrapper[13984]: I0312 12:54:46.355347 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-787f7977d6-w89p5_a16471dc-b7aa-43ed-876d-895680bd9539/manager/0.log" Mar 12 12:54:46.383162 master-0 kubenswrapper[13984]: I0312 12:54:46.383121 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56d7584dd9-8sxrm_c0d7dfa9-7424-491d-b0a0-3543e41efe0b/webhook-server/0.log" Mar 12 12:54:46.749347 master-0 kubenswrapper[13984]: I0312 12:54:46.743647 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcdctl/0.log" Mar 12 12:54:46.867730 master-0 kubenswrapper[13984]: I0312 12:54:46.867643 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtt27_c6b5193f-85dd-431d-8de7-27731f2cefd4/speaker/0.log" Mar 12 12:54:46.879958 master-0 kubenswrapper[13984]: I0312 12:54:46.879906 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtt27_c6b5193f-85dd-431d-8de7-27731f2cefd4/kube-rbac-proxy/0.log" Mar 12 12:54:47.075492 master-0 kubenswrapper[13984]: I0312 12:54:47.073506 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd/0.log" Mar 12 12:54:47.102292 master-0 kubenswrapper[13984]: I0312 12:54:47.102245 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-metrics/0.log" Mar 12 12:54:47.128138 master-0 kubenswrapper[13984]: I0312 12:54:47.125584 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-readyz/0.log" Mar 12 12:54:47.161751 master-0 kubenswrapper[13984]: I0312 12:54:47.161391 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-rev/0.log" Mar 12 12:54:47.183983 master-0 kubenswrapper[13984]: I0312 12:54:47.183935 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/setup/0.log" Mar 12 12:54:47.204553 master-0 kubenswrapper[13984]: I0312 12:54:47.204515 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-ensure-env-vars/0.log" Mar 12 12:54:47.226337 master-0 kubenswrapper[13984]: I0312 12:54:47.226297 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-resources-copy/0.log" Mar 12 12:54:47.275776 master-0 kubenswrapper[13984]: I0312 12:54:47.275410 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1/installer/0.log" Mar 12 12:54:47.337076 master-0 kubenswrapper[13984]: I0312 12:54:47.336957 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_51ef1ec5-3e17-485a-9797-566ef207fa0a/installer/0.log" Mar 12 12:54:49.050697 master-0 kubenswrapper[13984]: I0312 12:54:49.050640 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-556cc97fc4-pjrv5_38ef5c51-5afe-44c6-bfb5-5f80002f72fd/oauth-openshift/0.log" Mar 12 12:54:49.288264 master-0 kubenswrapper[13984]: I0312 12:54:49.288223 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/assisted-installer_assisted-installer-controller-p2nlp_33be2f5b-c837-4a07-8ad9-4400a36f53c1/assisted-installer-controller/0.log" Mar 12 
12:54:49.385711 master-0 kubenswrapper[13984]: I0312 12:54:49.385649 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kbz5s/master-0-debug-hsxkm"] Mar 12 12:54:49.388267 master-0 kubenswrapper[13984]: I0312 12:54:49.387974 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.486884 master-0 kubenswrapper[13984]: I0312 12:54:49.486814 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lnvc\" (UniqueName: \"kubernetes.io/projected/cf34bab5-1075-4d85-b08a-1d4ddd190cc0-kube-api-access-8lnvc\") pod \"master-0-debug-hsxkm\" (UID: \"cf34bab5-1075-4d85-b08a-1d4ddd190cc0\") " pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.486884 master-0 kubenswrapper[13984]: I0312 12:54:49.486883 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf34bab5-1075-4d85-b08a-1d4ddd190cc0-host\") pod \"master-0-debug-hsxkm\" (UID: \"cf34bab5-1075-4d85-b08a-1d4ddd190cc0\") " pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.590119 master-0 kubenswrapper[13984]: I0312 12:54:49.590055 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lnvc\" (UniqueName: \"kubernetes.io/projected/cf34bab5-1075-4d85-b08a-1d4ddd190cc0-kube-api-access-8lnvc\") pod \"master-0-debug-hsxkm\" (UID: \"cf34bab5-1075-4d85-b08a-1d4ddd190cc0\") " pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.590119 master-0 kubenswrapper[13984]: I0312 12:54:49.590121 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf34bab5-1075-4d85-b08a-1d4ddd190cc0-host\") pod \"master-0-debug-hsxkm\" (UID: \"cf34bab5-1075-4d85-b08a-1d4ddd190cc0\") " 
pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.590487 master-0 kubenswrapper[13984]: I0312 12:54:49.590424 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cf34bab5-1075-4d85-b08a-1d4ddd190cc0-host\") pod \"master-0-debug-hsxkm\" (UID: \"cf34bab5-1075-4d85-b08a-1d4ddd190cc0\") " pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.608706 master-0 kubenswrapper[13984]: I0312 12:54:49.608594 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lnvc\" (UniqueName: \"kubernetes.io/projected/cf34bab5-1075-4d85-b08a-1d4ddd190cc0-kube-api-access-8lnvc\") pod \"master-0-debug-hsxkm\" (UID: \"cf34bab5-1075-4d85-b08a-1d4ddd190cc0\") " pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.711163 master-0 kubenswrapper[13984]: I0312 12:54:49.710625 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" Mar 12 12:54:49.837551 master-0 kubenswrapper[13984]: I0312 12:54:49.837016 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" event={"ID":"cf34bab5-1075-4d85-b08a-1d4ddd190cc0","Type":"ContainerStarted","Data":"2543152b70dbe72ae5c343636c73ef57efbf2ab60e0cf6eea0cde382cb74a432"} Mar 12 12:54:50.872422 master-0 kubenswrapper[13984]: I0312 12:54:50.871447 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-98xjv_a346ac54-02fe-417f-a49d-038e45b13a1d/authentication-operator/1.log" Mar 12 12:54:50.872422 master-0 kubenswrapper[13984]: I0312 12:54:50.872058 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-98xjv_a346ac54-02fe-417f-a49d-038e45b13a1d/authentication-operator/0.log" Mar 12 12:54:51.942816 master-0 
kubenswrapper[13984]: I0312 12:54:51.942764 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-74a2-account-create-update-xbbbk_6d81f389-a57f-4baa-9df7-21072a39b775/mariadb-account-create-update/0.log" Mar 12 12:54:52.065325 master-0 kubenswrapper[13984]: I0312 12:54:52.064715 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-api-0_5fcc18ae-60e7-48a2-b6ef-9c6d98f38757/cinder-8c9c7-api-log/0.log" Mar 12 12:54:52.083555 master-0 kubenswrapper[13984]: I0312 12:54:52.083455 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-api-0_5fcc18ae-60e7-48a2-b6ef-9c6d98f38757/cinder-api/0.log" Mar 12 12:54:52.158020 master-0 kubenswrapper[13984]: I0312 12:54:52.157977 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-backup-0_4648a719-c73b-41e4-9409-ffc011468cf1/cinder-backup/0.log" Mar 12 12:54:52.170711 master-0 kubenswrapper[13984]: I0312 12:54:52.170656 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-backup-0_4648a719-c73b-41e4-9409-ffc011468cf1/probe/0.log" Mar 12 12:54:52.189184 master-0 kubenswrapper[13984]: I0312 12:54:52.187750 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-db-sync-rcknr_c180ef3d-38aa-4843-b682-92e609e6d856/cinder-8c9c7-db-sync/0.log" Mar 12 12:54:52.241469 master-0 kubenswrapper[13984]: I0312 12:54:52.241298 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79f8cd6fdd-pczct_f7e806c0-793c-42ba-91cb-12d64ba841d3/router/0.log" Mar 12 12:54:52.292945 master-0 kubenswrapper[13984]: I0312 12:54:52.292892 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-scheduler-0_bffc1e39-45d1-496f-b4af-8083ec87dabf/cinder-scheduler/0.log" Mar 12 12:54:52.307284 master-0 kubenswrapper[13984]: I0312 12:54:52.307092 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-8c9c7-scheduler-0_bffc1e39-45d1-496f-b4af-8083ec87dabf/probe/0.log" Mar 12 12:54:52.379063 master-0 kubenswrapper[13984]: I0312 12:54:52.379018 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-volume-lvm-iscsi-0_e688c798-286e-49cf-93f7-9c95e822b759/cinder-volume/0.log" Mar 12 12:54:52.391027 master-0 kubenswrapper[13984]: I0312 12:54:52.390977 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-8c9c7-volume-lvm-iscsi-0_e688c798-286e-49cf-93f7-9c95e822b759/probe/0.log" Mar 12 12:54:52.405498 master-0 kubenswrapper[13984]: I0312 12:54:52.402799 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-db-create-cpjvq_6567de78-8717-4214-8b7e-cd66e2ca2a24/mariadb-database-create/0.log" Mar 12 12:54:52.422960 master-0 kubenswrapper[13984]: I0312 12:54:52.422840 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6fd99d4b97-wqbt9_722fad77-eaea-4605-b6a6-58ece6b90875/dnsmasq-dns/0.log" Mar 12 12:54:52.433994 master-0 kubenswrapper[13984]: I0312 12:54:52.433627 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6fd99d4b97-wqbt9_722fad77-eaea-4605-b6a6-58ece6b90875/init/0.log" Mar 12 12:54:52.452256 master-0 kubenswrapper[13984]: I0312 12:54:52.452205 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-2374-account-create-update-thf95_b671c7b6-86a7-4134-a072-70b23814f541/mariadb-account-create-update/0.log" Mar 12 12:54:52.487011 master-0 kubenswrapper[13984]: I0312 12:54:52.486912 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-db-create-hncbq_68a41ec0-ae93-43bf-97a1-acda5b50ee55/mariadb-database-create/0.log" Mar 12 12:54:52.520207 master-0 kubenswrapper[13984]: I0312 12:54:52.520096 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-db-sync-zt2nr_f56d283a-8eb6-4927-bf3c-3145cc96ee28/glance-db-sync/0.log" Mar 12 12:54:52.613076 master-0 kubenswrapper[13984]: I0312 12:54:52.613027 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-f98a5-default-external-api-0_dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910/glance-log/0.log" Mar 12 12:54:52.695730 master-0 kubenswrapper[13984]: I0312 12:54:52.694554 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-f98a5-default-external-api-0_dfa7bcfe-114b-4fe9-ac5c-c8c480d0d910/glance-httpd/0.log" Mar 12 12:54:52.905244 master-0 kubenswrapper[13984]: I0312 12:54:52.904664 13984 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk"] Mar 12 12:54:52.908586 master-0 kubenswrapper[13984]: I0312 12:54:52.907650 13984 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:52.979041 master-0 kubenswrapper[13984]: I0312 12:54:52.978987 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk"] Mar 12 12:54:53.000395 master-0 kubenswrapper[13984]: I0312 12:54:53.000306 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-f98a5-default-internal-api-0_626c2e41-499c-4882-88e8-7d64d647be9e/glance-log/0.log" Mar 12 12:54:53.017542 master-0 kubenswrapper[13984]: I0312 12:54:53.016839 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-f98a5-default-internal-api-0_626c2e41-499c-4882-88e8-7d64d647be9e/glance-httpd/0.log" Mar 12 12:54:53.035094 master-0 kubenswrapper[13984]: I0312 12:54:53.034944 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-podres\") pod 
\"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.035329 master-0 kubenswrapper[13984]: I0312 12:54:53.035143 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-lib-modules\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.036168 master-0 kubenswrapper[13984]: I0312 12:54:53.036059 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-sys\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.036168 master-0 kubenswrapper[13984]: I0312 12:54:53.036105 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdmd7\" (UniqueName: \"kubernetes.io/projected/6497aa2e-74d0-4542-a102-a12953839b2e-kube-api-access-xdmd7\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.036496 master-0 kubenswrapper[13984]: I0312 12:54:53.036309 13984 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-proc\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.036496 master-0 kubenswrapper[13984]: I0312 12:54:53.036362 13984 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-5bc96c7dbf-x46lx_431f12da-d4a8-4748-87d1-4ffdd0b1ecc4/ironic-api-log/0.log" Mar 12 12:54:53.063861 master-0 kubenswrapper[13984]: I0312 12:54:53.063525 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-5bc96c7dbf-x46lx_431f12da-d4a8-4748-87d1-4ffdd0b1ecc4/ironic-api/0.log" Mar 12 12:54:53.080817 master-0 kubenswrapper[13984]: I0312 12:54:53.080763 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-5bc96c7dbf-x46lx_431f12da-d4a8-4748-87d1-4ffdd0b1ecc4/init/0.log" Mar 12 12:54:53.110377 master-0 kubenswrapper[13984]: I0312 12:54:53.110335 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_43e6fb75-b813-4074-8029-d6817b1bb9e2/ironic-conductor/0.log" Mar 12 12:54:53.128389 master-0 kubenswrapper[13984]: I0312 12:54:53.128337 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_43e6fb75-b813-4074-8029-d6817b1bb9e2/httpboot/0.log" Mar 12 12:54:53.139715 master-0 kubenswrapper[13984]: I0312 12:54:53.139651 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-sys\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.139715 master-0 kubenswrapper[13984]: I0312 12:54:53.139720 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdmd7\" (UniqueName: \"kubernetes.io/projected/6497aa2e-74d0-4542-a102-a12953839b2e-kube-api-access-xdmd7\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.140001 master-0 kubenswrapper[13984]: I0312 12:54:53.139801 13984 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-proc\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.140001 master-0 kubenswrapper[13984]: I0312 12:54:53.139923 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-podres\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.140089 master-0 kubenswrapper[13984]: I0312 12:54:53.140032 13984 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-lib-modules\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.143202 master-0 kubenswrapper[13984]: I0312 12:54:53.143143 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-sys\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.143338 master-0 kubenswrapper[13984]: I0312 12:54:53.143266 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-proc\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.143338 master-0 
kubenswrapper[13984]: I0312 12:54:53.143325 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-podres\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.143761 master-0 kubenswrapper[13984]: I0312 12:54:53.143729 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6497aa2e-74d0-4542-a102-a12953839b2e-lib-modules\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.158564 master-0 kubenswrapper[13984]: I0312 12:54:53.158402 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_43e6fb75-b813-4074-8029-d6817b1bb9e2/dnsmasq/0.log" Mar 12 12:54:53.165320 master-0 kubenswrapper[13984]: I0312 12:54:53.165244 13984 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdmd7\" (UniqueName: \"kubernetes.io/projected/6497aa2e-74d0-4542-a102-a12953839b2e-kube-api-access-xdmd7\") pod \"perf-node-gather-daemonset-qhbwk\" (UID: \"6497aa2e-74d0-4542-a102-a12953839b2e\") " pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.179882 master-0 kubenswrapper[13984]: I0312 12:54:53.179833 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_43e6fb75-b813-4074-8029-d6817b1bb9e2/init/0.log" Mar 12 12:54:53.213975 master-0 kubenswrapper[13984]: I0312 12:54:53.213892 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_43e6fb75-b813-4074-8029-d6817b1bb9e2/ironic-python-agent-init/0.log" Mar 12 12:54:53.241505 master-0 kubenswrapper[13984]: I0312 12:54:53.241001 13984 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:53.625438 master-0 kubenswrapper[13984]: I0312 12:54:53.625368 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-56c75bf4c7-5rbw4_720101f1-0833-45af-a5b7-4910ece2a589/oauth-apiserver/0.log" Mar 12 12:54:53.655500 master-0 kubenswrapper[13984]: I0312 12:54:53.652660 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-56c75bf4c7-5rbw4_720101f1-0833-45af-a5b7-4910ece2a589/fix-audit-permissions/0.log" Mar 12 12:54:53.891261 master-0 kubenswrapper[13984]: I0312 12:54:53.890919 13984 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk"] Mar 12 12:54:53.922890 master-0 kubenswrapper[13984]: I0312 12:54:53.922788 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" event={"ID":"6497aa2e-74d0-4542-a102-a12953839b2e","Type":"ContainerStarted","Data":"8edcebfe82e63b231becf067d0a40ae01b81a32f85590ac8dfc5e2b7ac9e9e14"} Mar 12 12:54:54.242216 master-0 kubenswrapper[13984]: I0312 12:54:54.242164 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-conductor-0_43e6fb75-b813-4074-8029-d6817b1bb9e2/pxe-init/0.log" Mar 12 12:54:54.255360 master-0 kubenswrapper[13984]: I0312 12:54:54.255238 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-create-fswww_149d48ba-e4d1-4684-ae20-f17d4e1f247f/mariadb-database-create/0.log" Mar 12 12:54:54.287500 master-0 kubenswrapper[13984]: I0312 12:54:54.284120 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-db-sync-ftw7v_27bcfca5-8ad8-4bc5-b95b-0629c699b6e3/ironic-db-sync/0.log" Mar 12 12:54:54.305154 master-0 kubenswrapper[13984]: I0312 12:54:54.304300 13984 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ironic-db-sync-ftw7v_27bcfca5-8ad8-4bc5-b95b-0629c699b6e3/init/0.log" Mar 12 12:54:54.315744 master-0 kubenswrapper[13984]: I0312 12:54:54.315653 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-df3c-account-create-update-rbfjs_f92e833d-9f24-4783-a5d6-89a7fa102026/mariadb-account-create-update/0.log" Mar 12 12:54:54.377841 master-0 kubenswrapper[13984]: I0312 12:54:54.377789 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_61638247-2447-4459-8be8-f91c2e718b46/ironic-inspector-httpd/0.log" Mar 12 12:54:54.398566 master-0 kubenswrapper[13984]: I0312 12:54:54.398472 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_61638247-2447-4459-8be8-f91c2e718b46/ironic-inspector/0.log" Mar 12 12:54:54.408388 master-0 kubenswrapper[13984]: I0312 12:54:54.408343 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_61638247-2447-4459-8be8-f91c2e718b46/inspector-httpboot/0.log" Mar 12 12:54:54.420503 master-0 kubenswrapper[13984]: I0312 12:54:54.418466 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_61638247-2447-4459-8be8-f91c2e718b46/ramdisk-logs/0.log" Mar 12 12:54:54.432947 master-0 kubenswrapper[13984]: I0312 12:54:54.432675 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_61638247-2447-4459-8be8-f91c2e718b46/inspector-dnsmasq/0.log" Mar 12 12:54:54.442507 master-0 kubenswrapper[13984]: I0312 12:54:54.439234 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-0_61638247-2447-4459-8be8-f91c2e718b46/ironic-python-agent-init/0.log" Mar 12 12:54:54.457205 master-0 kubenswrapper[13984]: I0312 12:54:54.457156 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ironic-inspector-0_61638247-2447-4459-8be8-f91c2e718b46/inspector-pxe-init/0.log" Mar 12 12:54:54.464998 master-0 kubenswrapper[13984]: I0312 12:54:54.464948 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-7750-account-create-update-mx9br_4f28f93e-f049-44fe-b721-ff4ad88db2b0/mariadb-account-create-update/0.log" Mar 12 12:54:54.485497 master-0 kubenswrapper[13984]: I0312 12:54:54.480553 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-db-create-n5b5j_9990518c-8209-4d3a-aad6-130834718172/mariadb-database-create/0.log" Mar 12 12:54:54.497561 master-0 kubenswrapper[13984]: I0312 12:54:54.493820 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-inspector-db-sync-6crfp_5bd44f3d-5677-4edd-8e29-8f010f3bfed1/ironic-inspector-db-sync/0.log" Mar 12 12:54:54.525407 master-0 kubenswrapper[13984]: I0312 12:54:54.525356 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-776949857-qhnzl_4cecbed0-f638-495f-a450-ddcb64f6cc30/ironic-neutron-agent/2.log" Mar 12 12:54:54.527711 master-0 kubenswrapper[13984]: I0312 12:54:54.527687 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ironic-neutron-agent-776949857-qhnzl_4cecbed0-f638-495f-a450-ddcb64f6cc30/ironic-neutron-agent/1.log" Mar 12 12:54:54.595089 master-0 kubenswrapper[13984]: I0312 12:54:54.595040 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7d857f7b77-g6ptr_f4c65284-94b1-4a57-9953-0b5f19554376/keystone-api/0.log" Mar 12 12:54:54.607996 master-0 kubenswrapper[13984]: I0312 12:54:54.607957 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bootstrap-p7rh2_dc6207a9-cfba-4912-9ee2-7ae929ac28bd/keystone-bootstrap/0.log" Mar 12 12:54:54.623503 master-0 kubenswrapper[13984]: I0312 12:54:54.619869 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-d62b-account-create-update-lr46l_b6f71885-a3f0-4897-bc6c-bfd657a35108/mariadb-account-create-update/0.log" Mar 12 12:54:54.627822 master-0 kubenswrapper[13984]: I0312 12:54:54.627780 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-create-fhfwn_2635bca8-9408-45c1-b88c-3634684d244a/mariadb-database-create/0.log" Mar 12 12:54:54.640460 master-0 kubenswrapper[13984]: I0312 12:54:54.640427 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-db-sync-6rxfx_e6e063ff-c1ba-4c47-ba35-c3ee0194dae6/keystone-db-sync/0.log" Mar 12 12:54:54.728116 master-0 kubenswrapper[13984]: I0312 12:54:54.728073 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/kube-rbac-proxy/0.log" Mar 12 12:54:54.770327 master-0 kubenswrapper[13984]: I0312 12:54:54.770219 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/3.log" Mar 12 12:54:54.786511 master-0 kubenswrapper[13984]: I0312 12:54:54.784672 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-ph7gk_632651f7-6641-49d8-9c48-7f6ea5846538/cluster-autoscaler-operator/4.log" Mar 12 12:54:54.816117 master-0 kubenswrapper[13984]: I0312 12:54:54.815990 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/4.log" Mar 12 12:54:54.816700 master-0 kubenswrapper[13984]: I0312 12:54:54.816677 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/cluster-baremetal-operator/5.log" Mar 12 12:54:54.838960 master-0 kubenswrapper[13984]: I0312 12:54:54.838907 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-pb97p_c7d2a100-a24a-4ae6-bd8e-4530163a3ffe/baremetal-kube-rbac-proxy/0.log" Mar 12 12:54:54.869670 master-0 kubenswrapper[13984]: I0312 12:54:54.869630 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4_c62edaec-38e2-4b73-8bb5-c776abfb310f/control-plane-machine-set-operator/1.log" Mar 12 12:54:54.871893 master-0 kubenswrapper[13984]: I0312 12:54:54.871785 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dnxx4_c62edaec-38e2-4b73-8bb5-c776abfb310f/control-plane-machine-set-operator/0.log" Mar 12 12:54:54.907101 master-0 kubenswrapper[13984]: I0312 12:54:54.906883 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-nq8zw_2bab9dba-235f-467c-9224-634cca9acbd2/kube-rbac-proxy/0.log" Mar 12 12:54:54.923325 master-0 kubenswrapper[13984]: I0312 12:54:54.923265 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-nq8zw_2bab9dba-235f-467c-9224-634cca9acbd2/machine-api-operator/0.log" Mar 12 12:54:54.924881 master-0 kubenswrapper[13984]: I0312 12:54:54.924849 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-nq8zw_2bab9dba-235f-467c-9224-634cca9acbd2/machine-api-operator/1.log" Mar 12 12:54:54.964516 master-0 kubenswrapper[13984]: I0312 12:54:54.964383 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" event={"ID":"6497aa2e-74d0-4542-a102-a12953839b2e","Type":"ContainerStarted","Data":"eeb67583dd3b0af17ab911a9237eb81780cd49b27264ec84509aad9cef75855a"} Mar 12 12:54:54.964892 master-0 kubenswrapper[13984]: I0312 12:54:54.964855 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:54:54.983519 master-0 kubenswrapper[13984]: I0312 12:54:54.982561 13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" podStartSLOduration=2.982540502 podStartE2EDuration="2.982540502s" podCreationTimestamp="2026-03-12 12:54:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-12 12:54:54.982290705 +0000 UTC m=+1827.180306207" watchObservedRunningTime="2026-03-12 12:54:54.982540502 +0000 UTC m=+1827.180555994" Mar 12 12:54:57.064243 master-0 kubenswrapper[13984]: I0312 12:54:57.064197 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/cluster-cloud-controller-manager/0.log" Mar 12 12:54:57.064906 master-0 kubenswrapper[13984]: I0312 12:54:57.064792 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/cluster-cloud-controller-manager/1.log" Mar 12 12:54:57.091454 master-0 kubenswrapper[13984]: I0312 12:54:57.091412 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/1.log" Mar 12 
12:54:57.092419 master-0 kubenswrapper[13984]: I0312 12:54:57.092130 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/config-sync-controllers/2.log" Mar 12 12:54:57.121019 master-0 kubenswrapper[13984]: I0312 12:54:57.120957 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-pvjft_81cb0504-9455-4398-aed1-5cc6790f292e/kube-rbac-proxy/0.log" Mar 12 12:54:59.986817 master-0 kubenswrapper[13984]: I0312 12:54:59.984306 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-55d85b7b47-gz7ll_02d5a507-4409-44b4-98bc-1751cdcc6c6a/kube-rbac-proxy/0.log" Mar 12 12:55:00.013894 master-0 kubenswrapper[13984]: I0312 12:55:00.013839 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-credential-operator_cloud-credential-operator-55d85b7b47-gz7ll_02d5a507-4409-44b4-98bc-1751cdcc6c6a/cloud-credential-operator/0.log" Mar 12 12:55:02.365881 master-0 kubenswrapper[13984]: I0312 12:55:02.365830 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-fg5mg_a154f648-b96d-449e-b0f5-ba32266000c2/openshift-config-operator/2.log" Mar 12 12:55:02.366558 master-0 kubenswrapper[13984]: I0312 12:55:02.366072 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-fg5mg_a154f648-b96d-449e-b0f5-ba32266000c2/openshift-config-operator/1.log" Mar 12 12:55:02.381627 master-0 kubenswrapper[13984]: I0312 12:55:02.381250 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-fg5mg_a154f648-b96d-449e-b0f5-ba32266000c2/openshift-api/0.log" Mar 12 12:55:02.590752 master-0 kubenswrapper[13984]: I0312 12:55:02.590705 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_65a4af9d-8221-42be-b78f-ada6b347b337/memcached/0.log" Mar 12 12:55:02.600409 master-0 kubenswrapper[13984]: I0312 12:55:02.599616 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-16d8-account-create-update-lp6v6_f2eefae0-724a-451b-b39b-753fa2551ba1/mariadb-account-create-update/0.log" Mar 12 12:55:02.750396 master-0 kubenswrapper[13984]: I0312 12:55:02.750336 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dc97885fc-g6g5j_f399c312-60ae-457b-b49e-481c112547d1/neutron-api/0.log" Mar 12 12:55:02.763601 master-0 kubenswrapper[13984]: I0312 12:55:02.763549 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6dc97885fc-g6g5j_f399c312-60ae-457b-b49e-481c112547d1/neutron-httpd/0.log" Mar 12 12:55:02.787775 master-0 kubenswrapper[13984]: I0312 12:55:02.786810 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-create-zfdwn_eb892f56-bc0f-4500-903c-49c41d16945b/mariadb-database-create/0.log" Mar 12 12:55:02.807544 master-0 kubenswrapper[13984]: I0312 12:55:02.807502 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-db-sync-dfnvv_d2a33401-2a04-4b67-ad4f-702034d7a0a6/neutron-db-sync/0.log" Mar 12 12:55:02.894255 master-0 kubenswrapper[13984]: I0312 12:55:02.892458 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_975b92c1-fe1e-48a0-97a5-d9c4a520166a/nova-api-log/0.log" Mar 12 12:55:02.968150 master-0 kubenswrapper[13984]: I0312 12:55:02.968110 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_975b92c1-fe1e-48a0-97a5-d9c4a520166a/nova-api-api/0.log" Mar 12 12:55:02.976874 master-0 kubenswrapper[13984]: I0312 12:55:02.976832 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-162c-account-create-update-jzxl9_afbeb3b5-acd9-4095-bc17-29c1b6a29960/mariadb-account-create-update/0.log" Mar 12 12:55:02.988833 master-0 kubenswrapper[13984]: I0312 12:55:02.986628 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-db-create-qfsbw_89d18698-6635-4aba-a43e-eaf81cf72796/mariadb-database-create/0.log" Mar 12 12:55:02.996171 master-0 kubenswrapper[13984]: I0312 12:55:02.996064 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-642a-account-create-update-lwl6p_c00d2711-454c-4bc3-9db9-55fa8537ecad/mariadb-account-create-update/0.log" Mar 12 12:55:03.009406 master-0 kubenswrapper[13984]: I0312 12:55:03.008722 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-cell-mapping-vd6w8_ddc6b5d3-5d22-463c-b9cf-45a8a8e2be0e/nova-manage/0.log" Mar 12 12:55:03.634263 master-0 kubenswrapper[13984]: I0312 12:55:03.634208 13984 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kbz5s/perf-node-gather-daemonset-qhbwk" Mar 12 12:55:03.729254 master-0 kubenswrapper[13984]: I0312 12:55:03.727799 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a9ed2b2f-ae45-4aaa-b4ac-328625f9b852/nova-cell0-conductor-conductor/0.log" Mar 12 12:55:03.757308 master-0 kubenswrapper[13984]: I0312 12:55:03.757263 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-db-sync-cd96k_fc8b81ae-1976-4abe-8d82-4decd232dd98/nova-cell0-conductor-db-sync/0.log" Mar 12 12:55:03.764965 master-0 kubenswrapper[13984]: I0312 12:55:03.764915 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-db-create-tkj85_413ffd1a-0d7d-4ee6-8868-1828490695d6/mariadb-database-create/0.log" Mar 12 12:55:03.783929 master-0 kubenswrapper[13984]: I0312 12:55:03.783875 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-0b4a-account-create-update-4zbjs_7180a2fd-8928-4b52-a707-a08eb7230a3b/mariadb-account-create-update/0.log" Mar 12 12:55:03.798184 master-0 kubenswrapper[13984]: I0312 12:55:03.797949 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-cell-mapping-52nsz_966ac026-07e7-4214-9c35-942fcd250899/nova-manage/0.log" Mar 12 12:55:03.871370 master-0 kubenswrapper[13984]: I0312 12:55:03.871319 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-compute-ironic-compute-0_f6d95490-fdae-4ecb-961d-553a7fda1436/nova-cell1-compute-ironic-compute-compute/0.log" Mar 12 12:55:03.964767 master-0 kubenswrapper[13984]: I0312 12:55:03.964468 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_67f9b44e-10c5-46d9-8091-006ca18b30ac/nova-cell1-conductor-conductor/0.log" Mar 12 12:55:03.979619 master-0 kubenswrapper[13984]: I0312 12:55:03.979566 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-db-sync-rf48n_ea457152-df63-450a-be0c-aaca7e72b0b4/nova-cell1-conductor-db-sync/0.log" Mar 12 12:55:03.994377 master-0 kubenswrapper[13984]: I0312 12:55:03.994311 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-db-create-nchvw_9cba6f06-5a3b-4ebc-b36b-779fb17b727b/mariadb-database-create/0.log" Mar 12 12:55:04.007415 master-0 kubenswrapper[13984]: I0312 12:55:04.007291 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-host-discover-j4k72_bc21a66e-84d4-4147-a050-e03b4cfc5ddd/nova-manage/0.log" Mar 12 12:55:04.072197 master-0 kubenswrapper[13984]: I0312 12:55:04.071713 13984 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_6b2ee394-60ab-4e16-8d45-7992a2ba039d/nova-cell1-novncproxy-novncproxy/0.log" Mar 12 12:55:04.177797 master-0 kubenswrapper[13984]: I0312 12:55:04.177742 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_954b0e9a-51a8-42d4-b768-c3ac8407ac2f/nova-metadata-log/0.log" Mar 12 12:55:04.277829 master-0 kubenswrapper[13984]: I0312 12:55:04.276880 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_954b0e9a-51a8-42d4-b768-c3ac8407ac2f/nova-metadata-metadata/0.log" Mar 12 12:55:04.342550 master-0 kubenswrapper[13984]: I0312 12:55:04.342411 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-6c7fb6b958-rnnjn_c424f946-e2fe-4450-816b-b79640269ff5/console-operator/0.log" Mar 12 12:55:04.378488 master-0 kubenswrapper[13984]: I0312 12:55:04.377975 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_be5e9fc6-888e-41b0-b034-cf71c4f47610/nova-scheduler-scheduler/0.log" Mar 12 12:55:04.397871 master-0 kubenswrapper[13984]: I0312 12:55:04.397823 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cb5b76fd-5724-4d55-94b6-3c071262be24/galera/0.log" Mar 12 12:55:04.413818 master-0 kubenswrapper[13984]: I0312 12:55:04.413784 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_cb5b76fd-5724-4d55-94b6-3c071262be24/mysql-bootstrap/0.log" Mar 12 12:55:04.437716 master-0 kubenswrapper[13984]: I0312 12:55:04.437659 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ea7532b5-e173-4190-a81d-a1e0d3bcd824/galera/0.log" Mar 12 12:55:04.448080 master-0 kubenswrapper[13984]: I0312 12:55:04.448024 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_ea7532b5-e173-4190-a81d-a1e0d3bcd824/mysql-bootstrap/0.log" Mar 12 12:55:04.456523 master-0 kubenswrapper[13984]: I0312 12:55:04.456444 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1f0d75f9-805b-4886-9bff-8544337a2fd1/openstackclient/0.log" Mar 12 12:55:04.470092 master-0 kubenswrapper[13984]: I0312 12:55:04.470021 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-7dvg9_f5cd50ae-2194-4717-96f0-47b3d353c8b1/ovn-controller/0.log" Mar 12 12:55:04.483304 master-0 kubenswrapper[13984]: I0312 12:55:04.480840 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-bjdfk_95386842-d198-447d-8b98-f74826151e8b/openstack-network-exporter/0.log" Mar 12 12:55:04.496502 master-0 kubenswrapper[13984]: I0312 12:55:04.495968 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vttwx_095ac834-8b0e-490f-b39f-c97135436fab/ovsdb-server/0.log" Mar 12 12:55:04.510677 master-0 kubenswrapper[13984]: I0312 12:55:04.510628 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vttwx_095ac834-8b0e-490f-b39f-c97135436fab/ovs-vswitchd/0.log" Mar 12 12:55:04.522357 master-0 kubenswrapper[13984]: I0312 12:55:04.522298 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-vttwx_095ac834-8b0e-490f-b39f-c97135436fab/ovsdb-server-init/0.log" Mar 12 12:55:04.534579 master-0 kubenswrapper[13984]: I0312 12:55:04.532981 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8c279ec7-a696-4020-a396-bda3ce501266/ovn-northd/0.log" Mar 12 12:55:04.540501 master-0 kubenswrapper[13984]: I0312 12:55:04.539955 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_8c279ec7-a696-4020-a396-bda3ce501266/openstack-network-exporter/0.log" Mar 12 12:55:04.570834 
master-0 kubenswrapper[13984]: I0312 12:55:04.568609 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b6e60ab1-d624-4df2-a3e8-a9044c72edfc/ovsdbserver-nb/0.log" Mar 12 12:55:04.579818 master-0 kubenswrapper[13984]: I0312 12:55:04.579754 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_b6e60ab1-d624-4df2-a3e8-a9044c72edfc/openstack-network-exporter/0.log" Mar 12 12:55:04.606551 master-0 kubenswrapper[13984]: I0312 12:55:04.605913 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cb8736d5-b5e7-4ab6-9755-d295836ae7a4/ovsdbserver-sb/0.log" Mar 12 12:55:04.615728 master-0 kubenswrapper[13984]: I0312 12:55:04.615555 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_cb8736d5-b5e7-4ab6-9755-d295836ae7a4/openstack-network-exporter/0.log" Mar 12 12:55:04.630349 master-0 kubenswrapper[13984]: I0312 12:55:04.629721 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-1719-account-create-update-6wbfk_f9a1f634-3f7f-4470-838b-e675c4fb5c4a/mariadb-account-create-update/0.log" Mar 12 12:55:04.660122 master-0 kubenswrapper[13984]: I0312 12:55:04.660060 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6bdc8876c8-5m8cl_fafd3eff-630a-4ba3-b901-41d3de58bc54/placement-log/0.log" Mar 12 12:55:04.675366 master-0 kubenswrapper[13984]: I0312 12:55:04.675322 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-6bdc8876c8-5m8cl_fafd3eff-630a-4ba3-b901-41d3de58bc54/placement-api/0.log" Mar 12 12:55:04.683232 master-0 kubenswrapper[13984]: I0312 12:55:04.683199 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-db-create-7mhht_13933f72-2f21-4c9d-8d80-c5fbd0e9f94f/mariadb-database-create/0.log" Mar 12 12:55:04.695553 master-0 kubenswrapper[13984]: I0312 12:55:04.695499 13984 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-db-sync-5pl65_c8085b96-6908-4956-9e4c-290797974f87/placement-db-sync/0.log" Mar 12 12:55:04.738537 master-0 kubenswrapper[13984]: I0312 12:55:04.737509 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83/rabbitmq/0.log" Mar 12 12:55:04.751398 master-0 kubenswrapper[13984]: I0312 12:55:04.748305 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_ae1ce187-98f0-4dc9-ba86-e18b4cfe6a83/setup-container/0.log" Mar 12 12:55:04.787495 master-0 kubenswrapper[13984]: I0312 12:55:04.787351 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f328e3e3-f9b4-4a88-9883-694d89c182f7/rabbitmq/0.log" Mar 12 12:55:04.794551 master-0 kubenswrapper[13984]: I0312 12:55:04.794497 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_f328e3e3-f9b4-4a88-9883-694d89c182f7/setup-container/0.log" Mar 12 12:55:04.808396 master-0 kubenswrapper[13984]: I0312 12:55:04.808348 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_root-account-create-update-vcdjd_8fc04a62-dfbf-401b-88ba-42dcf4acfaac/mariadb-account-create-update/0.log" Mar 12 12:55:04.839388 master-0 kubenswrapper[13984]: I0312 12:55:04.839228 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bcc965b68-5qb2v_0e02202f-67a8-46f0-a049-0f6b5d33358d/proxy-httpd/0.log" Mar 12 12:55:04.851926 master-0 kubenswrapper[13984]: I0312 12:55:04.851875 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-bcc965b68-5qb2v_0e02202f-67a8-46f0-a049-0f6b5d33358d/proxy-server/0.log" Mar 12 12:55:04.862940 master-0 kubenswrapper[13984]: I0312 12:55:04.862891 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-ring-rebalance-2mjxx_5767f9cc-96f2-4309-a6da-e89247924459/swift-ring-rebalance/0.log" Mar 12 12:55:04.910527 master-0 kubenswrapper[13984]: I0312 12:55:04.910466 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/account-server/0.log" Mar 12 12:55:04.927686 master-0 kubenswrapper[13984]: I0312 12:55:04.927621 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/account-replicator/0.log" Mar 12 12:55:04.938218 master-0 kubenswrapper[13984]: I0312 12:55:04.938186 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/account-auditor/0.log" Mar 12 12:55:04.950892 master-0 kubenswrapper[13984]: I0312 12:55:04.950787 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/account-reaper/0.log" Mar 12 12:55:04.966644 master-0 kubenswrapper[13984]: I0312 12:55:04.966553 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/container-server/0.log" Mar 12 12:55:04.992006 master-0 kubenswrapper[13984]: I0312 12:55:04.990561 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/container-replicator/0.log" Mar 12 12:55:04.996617 master-0 kubenswrapper[13984]: I0312 12:55:04.995877 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/container-auditor/0.log" Mar 12 12:55:05.006565 master-0 kubenswrapper[13984]: I0312 12:55:05.006312 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/container-updater/0.log" Mar 12 12:55:05.022528 master-0 
kubenswrapper[13984]: I0312 12:55:05.022350 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/object-server/0.log" Mar 12 12:55:05.040680 master-0 kubenswrapper[13984]: I0312 12:55:05.039978 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/object-replicator/0.log" Mar 12 12:55:05.048282 master-0 kubenswrapper[13984]: I0312 12:55:05.048226 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/object-auditor/0.log" Mar 12 12:55:05.057182 master-0 kubenswrapper[13984]: I0312 12:55:05.057121 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/object-updater/0.log" Mar 12 12:55:05.066672 master-0 kubenswrapper[13984]: I0312 12:55:05.066623 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/object-expirer/0.log" Mar 12 12:55:05.082196 master-0 kubenswrapper[13984]: I0312 12:55:05.082154 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/rsync/0.log" Mar 12 12:55:05.091029 master-0 kubenswrapper[13984]: I0312 12:55:05.090874 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_92e2764a-7fca-4b4a-ae89-131f181cdeb9/swift-recon-cron/0.log" Mar 12 12:55:05.498311 master-0 kubenswrapper[13984]: I0312 12:55:05.498269 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-79c4496f46-rfmzb_e0ace1d1-a0ed-404b-8a89-117a6b179047/console/0.log" Mar 12 12:55:05.539950 master-0 kubenswrapper[13984]: I0312 12:55:05.539729 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_downloads-84f57b9877-t5sfc_aa78efd5-b1b1-4b64-8ece-480566fadbca/download-server/0.log" Mar 12 12:55:06.943641 master-0 kubenswrapper[13984]: I0312 12:55:06.943417 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-5q4fw_f3b704e7-1291-4645-8a0d-2a937829d7ac/cluster-storage-operator/0.log" Mar 12 12:55:06.949647 master-0 kubenswrapper[13984]: I0312 12:55:06.949538 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-5q4fw_f3b704e7-1291-4645-8a0d-2a937829d7ac/cluster-storage-operator/1.log" Mar 12 12:55:07.274325 master-0 kubenswrapper[13984]: I0312 12:55:07.274202 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/4.log" Mar 12 12:55:07.275102 master-0 kubenswrapper[13984]: I0312 12:55:07.275051 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kf7kw_7b3a7c4f-f48f-474c-b31f-cd556f9ed9ef/snapshot-controller/3.log" Mar 12 12:55:07.326564 master-0 kubenswrapper[13984]: I0312 12:55:07.325922 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-vmj4h_5a012d0b-d1a8-4cd3-8b91-b346d0445f24/csi-snapshot-controller-operator/1.log" Mar 12 12:55:07.333538 master-0 kubenswrapper[13984]: I0312 12:55:07.333448 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-operator-5685fbc7d-vmj4h_5a012d0b-d1a8-4cd3-8b91-b346d0445f24/csi-snapshot-controller-operator/0.log" Mar 12 12:55:08.751253 master-0 kubenswrapper[13984]: I0312 12:55:08.750105 13984 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns-operator_dns-operator-589895fbb7-l8x6p_f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/dns-operator/0.log" Mar 12 12:55:08.772574 master-0 kubenswrapper[13984]: I0312 12:55:08.769633 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns-operator_dns-operator-589895fbb7-l8x6p_f3f295ac-7bc7-43b7-bd30-db82e7f16cd7/kube-rbac-proxy/0.log" Mar 12 12:55:10.427138 master-0 kubenswrapper[13984]: I0312 12:55:10.427084 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k8t84_9d47f860-d64a-49b8-b404-a67cbc2faeb6/dns/0.log" Mar 12 12:55:10.458528 master-0 kubenswrapper[13984]: I0312 12:55:10.458485 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-k8t84_9d47f860-d64a-49b8-b404-a67cbc2faeb6/kube-rbac-proxy/0.log" Mar 12 12:55:10.480568 master-0 kubenswrapper[13984]: I0312 12:55:10.480524 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-72w9q_36852fda-6aee-4a36-8724-537f1260c4c8/dns-node-resolver/0.log" Mar 12 12:55:11.756766 master-0 kubenswrapper[13984]: I0312 12:55:11.756705 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-7nb6b_ab087440-bdf2-4e2f-9a5a-434d50a2329a/etcd-operator/2.log" Mar 12 12:55:11.761388 master-0 kubenswrapper[13984]: I0312 12:55:11.761251 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-7nb6b_ab087440-bdf2-4e2f-9a5a-434d50a2329a/etcd-operator/1.log" Mar 12 12:55:12.225039 master-0 kubenswrapper[13984]: I0312 12:55:12.224984 13984 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" event={"ID":"cf34bab5-1075-4d85-b08a-1d4ddd190cc0","Type":"ContainerStarted","Data":"bb58af3d178f7feab3d8950866d0d9c8d644d710c9630e91e7ad2208b2ffe968"} Mar 12 12:55:12.252456 master-0 kubenswrapper[13984]: I0312 12:55:12.249445 
13984 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kbz5s/master-0-debug-hsxkm" podStartSLOduration=1.07887869 podStartE2EDuration="23.249420645s" podCreationTimestamp="2026-03-12 12:54:49 +0000 UTC" firstStartedPulling="2026-03-12 12:54:49.784656332 +0000 UTC m=+1821.982671824" lastFinishedPulling="2026-03-12 12:55:11.955198297 +0000 UTC m=+1844.153213779" observedRunningTime="2026-03-12 12:55:12.242424601 +0000 UTC m=+1844.440440093" watchObservedRunningTime="2026-03-12 12:55:12.249420645 +0000 UTC m=+1844.447436137" Mar 12 12:55:12.618334 master-0 kubenswrapper[13984]: I0312 12:55:12.618251 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcdctl/0.log" Mar 12 12:55:12.874906 master-0 kubenswrapper[13984]: I0312 12:55:12.874782 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd/0.log" Mar 12 12:55:12.889280 master-0 kubenswrapper[13984]: I0312 12:55:12.889220 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-metrics/0.log" Mar 12 12:55:12.898570 master-0 kubenswrapper[13984]: I0312 12:55:12.898451 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-readyz/0.log" Mar 12 12:55:12.910967 master-0 kubenswrapper[13984]: I0312 12:55:12.910928 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-rev/0.log" Mar 12 12:55:12.924772 master-0 kubenswrapper[13984]: I0312 12:55:12.924726 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/setup/0.log" Mar 12 12:55:12.938090 master-0 kubenswrapper[13984]: I0312 12:55:12.938041 13984 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-ensure-env-vars/0.log" Mar 12 12:55:12.948394 master-0 kubenswrapper[13984]: I0312 12:55:12.948341 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_29c709c82970b529e7b9b895aa92ef05/etcd-resources-copy/0.log" Mar 12 12:55:13.004124 master-0 kubenswrapper[13984]: I0312 12:55:13.004072 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_78e48de9-79eb-4b3c-bd18-aeeeadaaf5e1/installer/0.log" Mar 12 12:55:13.038730 master-0 kubenswrapper[13984]: I0312 12:55:13.038689 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_51ef1ec5-3e17-485a-9797-566ef207fa0a/installer/0.log" Mar 12 12:55:13.864270 master-0 kubenswrapper[13984]: I0312 12:55:13.864217 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-86d6d77c7c-kcnf4_cfd178d7-f518-413b-95ab-ab6687be6e0f/cluster-image-registry-operator/0.log" Mar 12 12:55:13.864821 master-0 kubenswrapper[13984]: I0312 12:55:13.864725 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_cluster-image-registry-operator-86d6d77c7c-kcnf4_cfd178d7-f518-413b-95ab-ab6687be6e0f/cluster-image-registry-operator/1.log" Mar 12 12:55:13.887150 master-0 kubenswrapper[13984]: I0312 12:55:13.884751 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-w99c4_2b840211-d5ff-4616-aa9d-50a5615e0f59/node-ca/0.log" Mar 12 12:55:14.913549 master-0 kubenswrapper[13984]: I0312 12:55:14.913491 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-vpss8_a22189f2-3f35-4ea6-9892-39a1b46637e2/ingress-operator/0.log" Mar 12 12:55:14.921830 master-0 kubenswrapper[13984]: I0312 12:55:14.921797 13984 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-vpss8_a22189f2-3f35-4ea6-9892-39a1b46637e2/ingress-operator/1.log" Mar 12 12:55:14.932904 master-0 kubenswrapper[13984]: I0312 12:55:14.932867 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-vpss8_a22189f2-3f35-4ea6-9892-39a1b46637e2/kube-rbac-proxy/0.log" Mar 12 12:55:15.883315 master-0 kubenswrapper[13984]: I0312 12:55:15.883274 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vb6qv_bd171751-5c5c-458f-8e24-4ffde0aa2501/serve-healthcheck-canary/0.log" Mar 12 12:55:17.142051 master-0 kubenswrapper[13984]: I0312 12:55:17.141031 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bvl5s_57f59392-f9b6-4b99-ad37-33c5b7c04adb/controller/0.log" Mar 12 12:55:17.148868 master-0 kubenswrapper[13984]: I0312 12:55:17.148821 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bvl5s_57f59392-f9b6-4b99-ad37-33c5b7c04adb/kube-rbac-proxy/0.log" Mar 12 12:55:17.168248 master-0 kubenswrapper[13984]: I0312 12:55:17.168181 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/controller/0.log" Mar 12 12:55:17.249244 master-0 kubenswrapper[13984]: I0312 12:55:17.248681 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-8f89dfddd-4vzl8_68c57a64-f30c-4caf-89ef-08bd0d36833e/insights-operator/1.log" Mar 12 12:55:17.296849 master-0 kubenswrapper[13984]: I0312 12:55:17.296759 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-8f89dfddd-4vzl8_68c57a64-f30c-4caf-89ef-08bd0d36833e/insights-operator/2.log" Mar 12 12:55:18.201978 master-0 kubenswrapper[13984]: I0312 12:55:18.201929 13984 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/frr/0.log" Mar 12 12:55:18.220996 master-0 kubenswrapper[13984]: I0312 12:55:18.220941 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/reloader/0.log" Mar 12 12:55:18.232223 master-0 kubenswrapper[13984]: I0312 12:55:18.232175 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/frr-metrics/0.log" Mar 12 12:55:18.239243 master-0 kubenswrapper[13984]: I0312 12:55:18.239184 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/kube-rbac-proxy/0.log" Mar 12 12:55:18.251707 master-0 kubenswrapper[13984]: I0312 12:55:18.248391 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/kube-rbac-proxy-frr/0.log" Mar 12 12:55:18.257003 master-0 kubenswrapper[13984]: I0312 12:55:18.256964 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/cp-frr-files/0.log" Mar 12 12:55:18.264915 master-0 kubenswrapper[13984]: I0312 12:55:18.264839 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/cp-reloader/0.log" Mar 12 12:55:18.277069 master-0 kubenswrapper[13984]: I0312 12:55:18.277022 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/cp-metrics/0.log" Mar 12 12:55:18.287821 master-0 kubenswrapper[13984]: I0312 12:55:18.287780 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-c6n7v_2ccd308f-4d04-477a-9671-ced98cf499c1/frr-k8s-webhook-server/0.log" Mar 12 12:55:18.309897 master-0 
kubenswrapper[13984]: I0312 12:55:18.309847 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-787f7977d6-w89p5_a16471dc-b7aa-43ed-876d-895680bd9539/manager/0.log" Mar 12 12:55:18.318143 master-0 kubenswrapper[13984]: I0312 12:55:18.318079 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56d7584dd9-8sxrm_c0d7dfa9-7424-491d-b0a0-3543e41efe0b/webhook-server/0.log" Mar 12 12:55:18.702764 master-0 kubenswrapper[13984]: I0312 12:55:18.702706 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtt27_c6b5193f-85dd-431d-8de7-27731f2cefd4/speaker/0.log" Mar 12 12:55:18.708408 master-0 kubenswrapper[13984]: I0312 12:55:18.708356 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jtt27_c6b5193f-85dd-431d-8de7-27731f2cefd4/kube-rbac-proxy/0.log" Mar 12 12:55:19.859153 master-0 kubenswrapper[13984]: I0312 12:55:19.859110 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_05e8b19e-ab9a-491a-820a-06c23194bc6d/alertmanager/0.log" Mar 12 12:55:19.883120 master-0 kubenswrapper[13984]: I0312 12:55:19.883082 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_05e8b19e-ab9a-491a-820a-06c23194bc6d/config-reloader/0.log" Mar 12 12:55:19.895341 master-0 kubenswrapper[13984]: I0312 12:55:19.895306 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_05e8b19e-ab9a-491a-820a-06c23194bc6d/kube-rbac-proxy-web/0.log" Mar 12 12:55:19.905096 master-0 kubenswrapper[13984]: I0312 12:55:19.905045 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_05e8b19e-ab9a-491a-820a-06c23194bc6d/kube-rbac-proxy/0.log" Mar 12 12:55:19.920113 master-0 kubenswrapper[13984]: I0312 12:55:19.920069 13984 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_05e8b19e-ab9a-491a-820a-06c23194bc6d/kube-rbac-proxy-metric/0.log" Mar 12 12:55:19.930795 master-0 kubenswrapper[13984]: I0312 12:55:19.930771 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_05e8b19e-ab9a-491a-820a-06c23194bc6d/prom-label-proxy/0.log" Mar 12 12:55:19.948541 master-0 kubenswrapper[13984]: I0312 12:55:19.948495 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_05e8b19e-ab9a-491a-820a-06c23194bc6d/init-config-reloader/0.log" Mar 12 12:55:20.015285 master-0 kubenswrapper[13984]: I0312 12:55:20.015243 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-674cbfbd9d-tztzr_ae2269d7-f11f-46d1-95e7-f89a70ee1152/cluster-monitoring-operator/0.log" Mar 12 12:55:20.057339 master-0 kubenswrapper[13984]: I0312 12:55:20.055266 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-krrg5_580c29d3-9f21-4949-8bfb-a7ec31f4da85/kube-state-metrics/0.log" Mar 12 12:55:20.105421 master-0 kubenswrapper[13984]: I0312 12:55:20.105375 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-krrg5_580c29d3-9f21-4949-8bfb-a7ec31f4da85/kube-rbac-proxy-main/0.log" Mar 12 12:55:20.135956 master-0 kubenswrapper[13984]: I0312 12:55:20.135760 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-68b88f8cb5-krrg5_580c29d3-9f21-4949-8bfb-a7ec31f4da85/kube-rbac-proxy-self/0.log" Mar 12 12:55:20.168052 master-0 kubenswrapper[13984]: I0312 12:55:20.167989 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-55dcbc49f6-v92w7_68e3ba35-74e7-4437-94b6-d17430d4059c/metrics-server/0.log" Mar 12 12:55:20.270580 
master-0 kubenswrapper[13984]: I0312 12:55:20.270546 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-76d66fb5d5-l4mgd_fdea4a82-0757-499a-9a74-df5f79373cbe/monitoring-plugin/0.log"
Mar 12 12:55:20.326566 master-0 kubenswrapper[13984]: I0312 12:55:20.326530 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-srg57_9e18a6c1-574b-4191-8672-ca05718474d6/node-exporter/0.log"
Mar 12 12:55:20.360603 master-0 kubenswrapper[13984]: I0312 12:55:20.359444 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-srg57_9e18a6c1-574b-4191-8672-ca05718474d6/kube-rbac-proxy/0.log"
Mar 12 12:55:20.382800 master-0 kubenswrapper[13984]: I0312 12:55:20.382725 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-srg57_9e18a6c1-574b-4191-8672-ca05718474d6/init-textfile/0.log"
Mar 12 12:55:20.421070 master-0 kubenswrapper[13984]: I0312 12:55:20.420951 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-tgch2_f89782fd-6737-4e13-898d-0683549c6673/kube-rbac-proxy-main/0.log"
Mar 12 12:55:20.447497 master-0 kubenswrapper[13984]: I0312 12:55:20.447435 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-tgch2_f89782fd-6737-4e13-898d-0683549c6673/kube-rbac-proxy-self/0.log"
Mar 12 12:55:20.481232 master-0 kubenswrapper[13984]: I0312 12:55:20.481180 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-74cc79fd76-tgch2_f89782fd-6737-4e13-898d-0683549c6673/openshift-state-metrics/0.log"
Mar 12 12:55:20.563370 master-0 kubenswrapper[13984]: I0312 12:55:20.563322 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5e27787-de34-48dd-8854-79387c59fa6c/prometheus/0.log"
Mar 12 12:55:20.598799 master-0 kubenswrapper[13984]: I0312 12:55:20.598763 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5e27787-de34-48dd-8854-79387c59fa6c/config-reloader/0.log"
Mar 12 12:55:20.620764 master-0 kubenswrapper[13984]: I0312 12:55:20.620707 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5e27787-de34-48dd-8854-79387c59fa6c/thanos-sidecar/0.log"
Mar 12 12:55:20.637598 master-0 kubenswrapper[13984]: I0312 12:55:20.637539 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5e27787-de34-48dd-8854-79387c59fa6c/kube-rbac-proxy-web/0.log"
Mar 12 12:55:20.657708 master-0 kubenswrapper[13984]: I0312 12:55:20.656197 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5e27787-de34-48dd-8854-79387c59fa6c/kube-rbac-proxy/0.log"
Mar 12 12:55:20.680626 master-0 kubenswrapper[13984]: I0312 12:55:20.680196 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5e27787-de34-48dd-8854-79387c59fa6c/kube-rbac-proxy-thanos/0.log"
Mar 12 12:55:20.700571 master-0 kubenswrapper[13984]: I0312 12:55:20.700530 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_e5e27787-de34-48dd-8854-79387c59fa6c/init-config-reloader/0.log"
Mar 12 12:55:20.725534 master-0 kubenswrapper[13984]: I0312 12:55:20.725489 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5ff8674d55-mgk6g_87c8661b-4fad-4cfc-960e-5ab500121810/prometheus-operator/0.log"
Mar 12 12:55:20.821506 master-0 kubenswrapper[13984]: I0312 12:55:20.821006 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5ff8674d55-mgk6g_87c8661b-4fad-4cfc-960e-5ab500121810/kube-rbac-proxy/0.log"
Mar 12 12:55:20.841767 master-0 kubenswrapper[13984]: I0312 12:55:20.841711 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-8464df8497-c6pf6_1aba020f-dfec-455d-83d7-728145cc4114/prometheus-operator-admission-webhook/0.log"
Mar 12 12:55:20.867322 master-0 kubenswrapper[13984]: I0312 12:55:20.866899 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d9f596674-hxrr7_790b9d8a-c3b9-41f6-8333-342e76485a8d/telemeter-client/0.log"
Mar 12 12:55:20.909286 master-0 kubenswrapper[13984]: I0312 12:55:20.909242 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d9f596674-hxrr7_790b9d8a-c3b9-41f6-8333-342e76485a8d/reload/0.log"
Mar 12 12:55:20.927903 master-0 kubenswrapper[13984]: I0312 12:55:20.927847 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6d9f596674-hxrr7_790b9d8a-c3b9-41f6-8333-342e76485a8d/kube-rbac-proxy/0.log"
Mar 12 12:55:20.963500 master-0 kubenswrapper[13984]: I0312 12:55:20.963433 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c5678cd69-z9dpl_015b85c8-1d75-4de2-94b7-1b43b43219a0/thanos-query/0.log"
Mar 12 12:55:20.976686 master-0 kubenswrapper[13984]: I0312 12:55:20.976660 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c5678cd69-z9dpl_015b85c8-1d75-4de2-94b7-1b43b43219a0/kube-rbac-proxy-web/0.log"
Mar 12 12:55:20.997139 master-0 kubenswrapper[13984]: I0312 12:55:20.997093 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c5678cd69-z9dpl_015b85c8-1d75-4de2-94b7-1b43b43219a0/kube-rbac-proxy/0.log"
Mar 12 12:55:21.018536 master-0 kubenswrapper[13984]: I0312 12:55:21.018453 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c5678cd69-z9dpl_015b85c8-1d75-4de2-94b7-1b43b43219a0/prom-label-proxy/0.log"
Mar 12 12:55:21.043500 master-0 kubenswrapper[13984]: I0312 12:55:21.042024 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c5678cd69-z9dpl_015b85c8-1d75-4de2-94b7-1b43b43219a0/kube-rbac-proxy-rules/0.log"
Mar 12 12:55:21.060022 master-0 kubenswrapper[13984]: I0312 12:55:21.059880 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-c5678cd69-z9dpl_015b85c8-1d75-4de2-94b7-1b43b43219a0/kube-rbac-proxy-metrics/0.log"
Mar 12 12:55:23.264616 master-0 kubenswrapper[13984]: I0312 12:55:23.262520 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-677bd678f7-htjwl_e7256eef-5c42-4470-98da-e9ccc9d0fa7c/manager/0.log"
Mar 12 12:55:23.760580 master-0 kubenswrapper[13984]: I0312 12:55:23.760419 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bvl5s_57f59392-f9b6-4b99-ad37-33c5b7c04adb/controller/0.log"
Mar 12 12:55:23.775405 master-0 kubenswrapper[13984]: I0312 12:55:23.775317 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-bvl5s_57f59392-f9b6-4b99-ad37-33c5b7c04adb/kube-rbac-proxy/0.log"
Mar 12 12:55:23.823174 master-0 kubenswrapper[13984]: I0312 12:55:23.823088 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/controller/0.log"
Mar 12 12:55:24.074676 master-0 kubenswrapper[13984]: I0312 12:55:24.072515 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-h626x_fe570816-5558-444f-8bd8-6b5a75a80f77/manager/0.log"
Mar 12 12:55:24.090177 master-0 kubenswrapper[13984]: I0312 12:55:24.090020 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-c9cfd_225a2621-3a81-48d7-bf57-8cb7355e9acf/manager/0.log"
Mar 12 12:55:24.100932 master-0 kubenswrapper[13984]: I0312 12:55:24.100875 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g_815a5222-56be-4570-92a1-0cd0d1b7941d/extract/0.log"
Mar 12 12:55:24.109956 master-0 kubenswrapper[13984]: I0312 12:55:24.109909 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g_815a5222-56be-4570-92a1-0cd0d1b7941d/util/0.log"
Mar 12 12:55:24.116864 master-0 kubenswrapper[13984]: I0312 12:55:24.116813 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_f9f18d30af743f52483ac2b056c423e2f043de5970b22bfcfee7015477pjv7g_815a5222-56be-4570-92a1-0cd0d1b7941d/pull/0.log"
Mar 12 12:55:24.246726 master-0 kubenswrapper[13984]: I0312 12:55:24.246690 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-jzwpg_f6fe9677-4309-41ac-b5dc-a6e7c95af7e8/manager/0.log"
Mar 12 12:55:24.262615 master-0 kubenswrapper[13984]: I0312 12:55:24.262566 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-g5rdl_8e479d11-fd58-4636-9410-297bb6d4f88f/manager/0.log"
Mar 12 12:55:24.277357 master-0 kubenswrapper[13984]: I0312 12:55:24.277256 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-9t965_b6d55477-329a-4fe6-9af5-a1146ee8844e/manager/0.log"
Mar 12 12:55:24.633585 master-0 kubenswrapper[13984]: I0312 12:55:24.633489 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-b8c8d7cc8-6jzjg_ee56de80-f409-484e-87df-4a4a9b6cf52e/manager/0.log"
Mar 12 12:55:24.738094 master-0 kubenswrapper[13984]: I0312 12:55:24.738035 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6bbb499bbc-vg66q_39c85f77-0a7e-4bcc-8f2c-c7801034b477/manager/0.log"
Mar 12 12:55:24.829879 master-0 kubenswrapper[13984]: I0312 12:55:24.829827 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-th4r4_af967d1d-e790-4f8c-85a0-a3758a3b5f77/manager/0.log"
Mar 12 12:55:24.842066 master-0 kubenswrapper[13984]: I0312 12:55:24.842003 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-68f45f9d9f-mnp48_d78fcdf2-593b-4e94-93f2-5a7091c6c2af/manager/0.log"
Mar 12 12:55:24.900841 master-0 kubenswrapper[13984]: I0312 12:55:24.899782 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-658d4cdd5-mssg2_c9f726ff-a744-4adb-a508-e144b957f0c9/manager/0.log"
Mar 12 12:55:24.976466 master-0 kubenswrapper[13984]: I0312 12:55:24.976328 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-7gggt_5d2c9b0f-aecf-4033-8364-d746566a5632/manager/0.log"
Mar 12 12:55:25.108817 master-0 kubenswrapper[13984]: I0312 12:55:25.107631 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-569cc54c5-tlxh4_4433ab46-d919-4939-84f7-911505e17f63/manager/0.log"
Mar 12 12:55:25.123557 master-0 kubenswrapper[13984]: I0312 12:55:25.120549 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-ggqzh_90572a74-c9ea-448f-a63a-a6db44987f9e/manager/0.log"
Mar 12 12:55:25.149421 master-0 kubenswrapper[13984]: I0312 12:55:25.149381 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-c969dbbcd-cfj7s_8f0fd875-5848-456f-945e-bb7cabd5af6e/manager/0.log"
Mar 12 12:55:25.430197 master-0 kubenswrapper[13984]: I0312 12:55:25.429638 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jgcbm_4ba27210-3da6-4946-a322-9f2d3bed28e0/frr/0.log"
Mar 12 12:55:25.447936 master-0 kubenswrapper[13984]: I0312 12:55:25.447652 13984 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-65b9994cf8-fg2sk_a3bf35a3-098b-458a-9482-e7946bec4917/operator/0.log"